Adatis BI Blogs

PerformancePoint Services 2010 – First impressions

We've been lucky enough (or rather we hassled enough people at MSFT for long enough!) to have been participating in the Office 2010 technical preview for the last couple of months, but as it's all been under NDA we haven't been able to blog about it.  It also means that we've had a chance to look round SharePoint 2010 and, in particular for me, PerformancePoint Services.  Nick Barclay has just done a series of posts about what's new/improved/different in the new version, so go there for the full list.  Here's a quick round-up of our first impressions.

What looks good so far:

SharePoint integration - Whilst Dashboard Designer is still pretty much the same product for doing your, errr, dashboard design, it's no longer the admin and security tool as well.  This is all handled in SharePoint, and in fact you have to set up a specific PPS site to do it.  A great deal of effort has obviously gone into this and it looks to have paid off.  Security is all through SharePoint - no need to set up permissions twice!
AS conditional formatting now works.
Decomp tree is back!!
Measures can now be formatted independently.
Workspace browser is now much more intelligently organised.
Filter by value - you can now restrict rows/columns by value.
Dynamic dimension measures on scorecards - this was a bit of a workaround in 2007, as I've posted about previously.  It now works properly.
Re-usable filters - filters can now be shared and re-used across dashboards.

Disappointments:

Lack of improvements for data visualisation - Very disappointing: other than the decomp tree, the visualisation side of PPS has changed little.  Still no real control over how your graphs look.  The only other new item to be introduced is the Pie Chart!!!! Oh dear.  Still no bar charts (and I mean bars, not columns), no chart formatting options or (controllable) second y-axis options that I can see :(
Decomp tree is not a chart type but a right-click option from a deployed report.
I like the option to do this from any point in a report, but it would be nice to have both options.
It's still called PerformancePoint! - I have to admit that when I read another blog of Nick's following the demise of Planning, I didn't entirely agree with him that it should be renamed.  Having spent the last ten months trying to explain to various IT departments that PerformancePoint is not the devil and that the Monitoring side has not been affected (usually to no avail) has changed my opinion completely.
As per Chris's blog, ProClarity just seems to have disappeared - I know that was never what Monitoring was supposed to be, but the lack of an ad-hoc cube browser is a huge oversight.
Did I mention the lack of data visualisation improvements????

There's lots more to discuss and there will be more to come over the next few weeks, time allowing.  SharePoint 2010 looks pretty impressive…

Dynamic Dimension members on a PerformancePoint KPI

One of our customers had read Nick Barclay's post on dynamic sets in SSAS 2008 and was hoping that this would mean you could create KPIs with dynamic dimension members.  Well, the answer is yes and no.  It's already possible to do this in Monitoring (or should I say PerformancePoint Services) using custom sets in the scorecard designer (more on this below).  However, in PPS Monitoring these sets are resolved at the point the scorecard is rendered in the browser.  This is fine as long as the members of your sets are not affected by the filters applied to your scorecard (member.children, for example) - unfortunately the set does not get re-queried when you change a filter.  For instance, if you were to create a set of your top 10 customers and drag that onto the rows of your scorecard, changing a time filter will not cause the KPI dimension members to change, even if you've used time.currentmember in your set definition.  So you may end up displaying the top 10 customers for the current month, which may be different to the top 10 for the selected time period.

Update: Please see Nick Barclay's comment below for a very neat solution to this issue using filter link formulas.  (Wish I'd thought of that!)

Custom sets in the scorecard designer aren't the most obvious thing to use, nor are they very user-friendly.  Your best bet is to use a tool like SQL Management Studio or Mosha's MDX Studio to design a query that you know works, then paste the MDX for the set into the custom set formula editor.  You access this by dragging the Custom item in the Details pane onto the relevant position on your scorecard, then pasting your set query into the pop-up dialog.  For example:

TOPCOUNT(
    [Product].[Product Model Categories].[Subcategory].members,
    10,
    [Measures].[Internet Sales Amount]
)

You can then use the update button on the edit tab of the ribbon to see the results.  Unfortunately there's no way to edit the custom set once you've added it.
You have to delete the dimension members and then add a new custom set.

RIP PerformancePoint Planning

It's nearly a week since the announcement that shook the (PPS) world!  It's been a bit difficult to report on; generally the Adatis blogs try to offer solutions to problems we have encountered out in the real world.  Now I could say something crass here about the real world and the decision makers involved... but that would be childish, right? If I were to offer up my feelings, they wouldn't be far from Alan Whitehouse's excellent post on the subject.  If I had an ounce of class about me, it would be much more aligned with Adrian's poignant discussion opener, the one with the sharp-witted title, but alas...

We've spent the best part of the week speaking to customers, partners and Microsoft about what to do next.  The timing was choice - would you believe, we actually had three new PerformancePoint Planning phases kicking off this week; according to my project plan, I should be setting up Kerberos as we speak.  [There is always a positive, right?]

Some customers are carrying on regardless; they...
...already have planning deployments and are too far invested and dependent to back out at this stage or,
...have a short-term view (that's not a criticism) and need a "quick" fix with a low TCO to get them through some initial grief.  (Typically these customers are going through rapid organisational change, or form part of a recent acquisition, and, to help them see the wood from the trees during the transition, require short/sharp solutions.)

Other customers, with longer-term views, feel the product, or more importantly the suitably skilled resource pool, will drain away far quicker than the life-span of the much-touted Microsoft product support.  I have to agree - fact: Adatis will not be employing or training any more PerformancePoint Planning Consultants.  I doubt many other consulting firms will either.
It's those customers with the longer-term view that are currently in limbo - they are experiencing pain, they need pain relief; what should they do - wait and see what Office 14/15 offers? (There is talk of some planning functionality appearing in future Office versions - what truth there is in that...?)  The Dynamics customers could wait for the resurrection of Forecaster - I have it on good authority that Forecaster will be developed to be closer, in terms of flexibility, to PPS Planning.  I had originally heard the opposite view, that Forecaster would be replaced with a cut-down version of PPS Planning.  Either way, I'm sure some of the PPS Planning code-base will be utilised, which could end rumours of PPS Planning being 'given' to the community as some form of community/open-source arrangement.  An arrangement that is, in my opinion, a non-starter anyway: "Hey, Mr FD, we've got this great open-source budgeting and forecasting product we think you should implement!" - yeah, right!

Another rumour (and mixed message) is that Service Pack 3 will contain some of the requested features that were earmarked for version 2 (after all, the code has already been written, right?).  This rumour was actually started by Guy Weismantel in his announcement video.  However, the information I have since received clearly states that Service Pack 3 will contain stability and bug fixes only - so which is it to be?  It's unlikely for a service pack to contain new features, but it's not unheard of; anyone remember the original release of Reporting Services?  That arrived as part of a service pack for SQL Server 2000.

The burning question I cannot get answered is: have Microsoft actually stepped out of the BPM market for good?  We are told that Excel, SharePoint and SQL Server provide BPM - I can't see, without Planning, how they can.
Short of hard-coded values, renewed SharePoint/Excel hell, another vendor or a bespoke planning solution, businesses can't set plans, which has further-reaching implications: Planning's demise effectively shelves the Scorecard/KPI functionality from the M&A toolset too!  It will be interesting to see the new Monitoring & Analytics marketing - will they still demo strategy maps and scorecards, or will they now focus on decomposition trees and heat maps?  Monitoring & Analytics may, in practice, just become Analytics.

I would have thought the cost of continuing to develop the product (even if it were a lemon, which Planning certainly wasn't) is far less than the potential loss of revenue that Microsoft will face, due not only to the loss of confidence by its customers (who are going to think twice about investing in any Microsoft product now, let alone a V1) but, perhaps more significantly, the doors it opens to its competitors, who can offer a complete BI/BPM stack.  Planning was a foot in the customer's door for BI - once you put Planning in, the customer had already bought the full BI stack, and in most cases our customers were wowed by what they could now achieve.  I suspect Cognos and SAP are still partying now!

Audit Trail in PerformancePoint Planning

I've noticed that the PPS TechNet documentation has been updated recently to include an official Microsoft method for carrying out auditing in PPS Planning. PPS will do some basic auditing out of the box, namely to the audit.log file on the server. This will automatically capture key events that occur on the server, e.g. creation of a model, updating of a dimension etc. The audit file does not, however, track changes to the model fact data. There has been a custom solution around for this for a while now - Sacha has written an excellent post here that details what you need to do in order to implement your own PPS audit trail. Like Sacha's method, the Microsoft approach involves creating auditing tables, which should then be populated by running a custom stored procedure. The stored procedure should be scheduled on a periodic basis (e.g. hourly) to capture any new activity. This is a bit different to Sacha's method, where triggers are used to capture changes in real time as they occur. In both cases the idea is to use something like Reporting Services to view detailed auditing reports on your PPS data. One thing that did catch my eye in the TechNet documentation is a method to decode the binary 'change list' column that's held in the dbo.Submissions table. Whereas you can manually export the change list to a CSV file, there has historically been no way to take what's in the change list column and automatically decode it into a useful format.
The following C# code (reformatted from the TechNet sample, with the missing braces restored and the SELECT widened to return the columns the loop actually reads) will read the change list, and then insert it into your newly created auditing table:

DataSet ds = new DataSet();
DataLayer dl = new DataLayer("PPSConnection");

// FETCH SUBMISSIONS THAT HAVE NOT YET BEEN AUDITED
ds = dl.ExecuteDataSetFromSQL(
    @"SELECT * FROM [_AppDB].[dbo].[Submissions] s1
      WHERE s1.SubmissionID NOT IN
          (SELECT SubmissionID FROM [_StagingDB].[dbo].[SubmissionsAudited])
      AND s1.[Status] = 0");

string sSQL = "";
foreach (DataRow r in ds.Tables[0].Rows)
{
    // RETRIEVE THE CHANGELIST FOR THIS SUBMISSION
    DataSetWrapper dsw = new DataSetWrapper((Byte[])r["ChangeList"]);

    foreach (DataRow cldr in dsw.DataSet.Tables[0].Rows)
    {
        sSQL = @"INSERT INTO SubmissionsAudited(… ) VALUES(";

        // SUBMISSION ROW DATA
        sSQL += r[0].ToString() + ", " + r[1].ToString() + ", "
              + r[2].ToString() + ", " + r[3].ToString() + ", '"
              + r[4].ToString() + "'";

        // CHANGELIST ROW DATA
        foreach (object o in cldr.ItemArray)
        {
            sSQL += ", " + o.ToString();
        }
        sSQL += ")";

        // STORE EACH CHANGE TO THE AUDIT TABLE
        dl.ExecuteNonQuery(sSQL);
    }
}

Click here to view the TechNet documentation.
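One footnote on the TechNet sample: it builds the INSERT statement by concatenating raw values, so any string value containing a single quote will break the generated SQL. In real code a parameterised command is the robust fix; as a minimal illustration of the problem, here is a small hypothetical helper (`EscapeSqlLiteral` is my own name, not part of the sample) that at least doubles embedded quotes:

```csharp
using System;

class AuditSqlHelper
{
    // Hypothetical helper, not part of the TechNet sample: when string values
    // are concatenated into an INSERT statement, embedded single quotes must
    // be doubled or the SQL will not parse. A parameterised SqlCommand would
    // be the better choice where you control the data access code.
    public static string EscapeSqlLiteral(string value)
    {
        return value.Replace("'", "''");
    }

    static void Main()
    {
        // A value containing a quote now survives concatenation
        Console.WriteLine(EscapeSqlLiteral("O'Brien's forecast"));
        // prints: O''Brien''s forecast
    }
}
```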

PerformancePoint SP2 - Planning Fixes and a mini-feature

Jeremy has already announced the release of PerformancePoint Server SP2, and it's great to see that the PPS dev team hit their target release date!  I've spent a little commute time this morning checking out the documentation.  Admittedly I've initially focused on the Planning component, and there are no great surprises (Tim has already told you about the new bits), but I have spotted what could arguably be described as a mini-feature surrounding form validation that I'm sure will come in useful.

As you would expect, previously released hotfixes have been packaged up into this service pack:

954710 Description of the PerformancePoint Server 2007 hotfix package: July 1, 2008
955432 Description of the PerformancePoint Server 2007 hotfix package: July 14, 2008
955751 Description of the PerformancePoint Server 2007 hotfix package: July 28, 2008
956553 Description of the PerformancePoint Server 2007 hotfix package: August 21, 2008

Plus fixes to issues not previously addressed:

Excel Add-in Related

You locally save and close a form in PerformancePoint Add-in for Excel. When you reopen the form, you are prompted to update the form. However, you expect not to be prompted because the form is already up to date.

In PerformancePoint Add-in for Excel, you open an offline form assignment. In the form assignment, you add a link to an external Excel worksheet in a cell. Then you submit the changes to the PerformancePoint Planning Server database. However, when you reopen the assignment, the link that you added is not retained.

After you install PerformancePoint Server 2007 Service Pack 1, you create a page filter in PerformancePoint Add-in for Excel. You have a user in PerformancePoint Server 2007 that does not have permission to the default member of the page filter. However, the user has permission to other leaf members in the page filter.
When the user opens a report that uses this page filter, the user receives the following error message: "Cannot render the <MatrixName> matrix. The server returned the following error: The <CubeName> cube either does not exist or has not been processed." However, in the release version of PerformancePoint Server 2007, the next member that the user has access to will be automatically selected for use in the page filter.

You define data validation in a worksheet of Excel. However, you can still submit a form in PerformancePoint Add-in for Excel if data in the form is not validated.

You have a matrix that is based on a large and complex model in PerformancePoint Add-in for Excel. You open the Select Filters dialog box to change a page filter for this matrix. When you click the Value column of the filter, the dialog box that displays the dimension members takes a long time to display.

Business Rules Related

After you migrate an application in PerformancePoint Server 2007 from one server to another server, the order of user-defined business rules and system business rules in models is not preserved.

You cannot use the datamember function in the ALLOCATE statement and in the TRANSFER statement.

Consider the following scenario. You create an automatic rule that uses MdxQuery implementation or Native MdxQuery implementation in Planning Business Modeler. Then you submit changes to the source data that the rule uses from an assignment form. The submission causes the model to be reprocessed. Because model reprocessing causes rules in the automatic rule set to be executed, you expect that the target data of the automatic rule will reflect the change from the form submission. However, after the model is reprocessed, the target data of the automatic rule does not reflect the change.

Rule expressions of system business rules use dimension member names instead of dimension member labels in PerformancePoint Server 2007.
Planning Business Modeler Related

You have a model that contains many form templates and assignments. When you try to change objects in the model in Planning Business Modeler, Planning Business Modeler crashes.

You create a member property of the Date data type in a dimension in PerformancePoint Server 2007. Additionally, you specify the Set value to Null option when you create the member property. When you retrieve the value of this member property, you obtain a value of 1899-12-31T00:00:00. However, you expect to obtain a blank value.

You cannot schedule recurring jobs for a frequency of less than an hour.

When a user updates a business rule in Planning Business Modeler, the audit log file of PerformancePoint Server 2007 logs the user ID of the user that created the rule. However, you expect the audit log file to log the user ID of the user that updated the rule.

Consider the following scenario. You create a dimension that has no hierarchy in a localized version of PerformancePoint Server 2007. Then you perform one of the following operations: you run the bsp_DI_CreateHierarchyLabelTableForDimension stored procedure to create a label-based hierarchy table for the dimension, or you perform the Prepare the Staging DB operation in PerformancePoint Planning Data Migration Tool. In this scenario, you receive the following error message: "A problem was encountered while attempting to connect to, or Execute BSP on, the specified Database. For more information regarding this error please review the Application Event Log on the SQL Server for any "MSSQLSERVER ERRORS" and/or please check that all parameters in the UI are correct and try again."

What's New in PerformancePoint SP2?

Correction: My badly titled post suggested this was the list of fixes rather than new features. Unless the read-mes (Planning and Monitoring) are still being updated, there are no real surprises in terms of new features in Service Pack 2 for PerformancePoint Server.  More about the fixes in the Knowledge Base (KB958291, KB960815).

Planning

Support for Windows Server 2008 Hyper-V: You can now use PerformancePoint Server 2007 SP2 with Windows Server 2008 Hyper-V. Hyper-V creates new opportunities for server virtualization. You can use Hyper-V to make more efficient use of system hardware and host operating system resources to reduce the overhead that is associated with virtualization. For more information, see the PerformancePoint Server 2007 Hyper-V guide.

Monitoring

You can now use SQL Server 2008 with PerformancePoint Server 2007 SP2. Important: To use SQL Server 2008 with PerformancePoint Server, you must install PerformancePoint Server 2007 SP2 before you install SQL Server 2008.

You can now use the Show Details action on PerformancePoint reports that use data that is stored in SQL Server 2008 Analysis Services. Show Details enables dashboard consumers to right-click in a cell or on a chart value and see the transaction-level details for that value.

PerformancePoint Server 2007 with SP2 now supports Windows Server 2008 Hyper-V, as above. For more information, see the PerformancePoint Server 2007 Hyper-V guide.

You can now use Dashboard Designer on a computer that is running .NET Framework 3.5 alongside .NET Framework 2.0. You must install .NET Framework 2.0 before you install .NET Framework 3.5.
You can now use PerformancePoint Server with domains that have apostrophes in their names. In previous versions of PerformancePoint Server, when a domain name included an apostrophe, the configuration tool failed for both Planning Server and Monitoring Server.

Scorecard key performance indicator (KPI) queries are improved. Timeout errors no longer occur with scorecard KPIs that use data that is stored in SQL Server 2005 Analysis Services.

Time Intelligence Post Formula filters now display the correct number of days for each month. In previous versions of PerformancePoint Server, the calendar control for Time Intelligence Post Formula filters sometimes displayed 31 days for each month. This is no longer the case.

Time Intelligence filters now work on scorecard KPIs that use data that is stored in Analysis Services. In previous versions of PerformancePoint Server, some Time Intelligence expressions caused filters that were linked to KPIs to fail. For example, when a compound expression such as (Day-7:Day-1) was used in a Time Intelligence Post Formula filter and that filter was linked to a KPI, an error message occurred. In PerformancePoint Server 2007 SP2, single and compound Time Intelligence expressions work with KPIs that use data that is stored in Analysis Services.

Entering Dates in PPS Planning Assignments

In the recent PPS Planning projects that I've been involved in, the challenges have often been around subjects such as business rules, hence the often recurring theme of this blog. Recently the tables were turned, though, as I was told by a user that they wanted to enter dates into a PPS assignment. I was initially a bit concerned that the Excel add-in may not be able to deliver here - after all, it's great at capturing numbers, but knowing the rigid structure of the fact tables, I couldn't see how it would manage to store a date. Then I remembered something from my VBA days many years ago - namely that Excel stores dates as a number of days from 30/12/1899 - meaning in theory it should be possible to get dates working in PPS. Thankfully it is possible, as this post explains.

Excel Setup

The first step to get this working when designing your form template is to set the matrix to have a matrix style of 'none'. If you don't do this, then the built-in matrix styles will override your formatting changes to the required cells. Speaking of formatting, the next step is to format the data entry cells that will contain dates, just using the standard Excel formatting window. Once these few simple steps are done, the assignment will behave just like any other. As the date is stored as a number, the numeric representation of the date will end up in the fact table just like any other piece of data.

Dates in Business Rules

Once the numbers are in the fact table, we need to convert them to dates to use them in business rules in some way. We can't do much in PEL unfortunately, so the options are either NativeMDX or NativeSQL. As Analysis Services can pick up some of the VBA functions, it's possible to use the VBA DateAdd() function to convert the stored number back into a date.
So in the example below, I'm using the DateAdd() function to convert the number to a date, before comparing the resulting date against another date using the VBA DateDiff() function:

WITH
MEMBER [Measures].[DateExample] AS
    VBA!DateAdd("d", [Measures].[Value], "30/12/1899")
MEMBER [Measures].[DateDiff] AS
    VBA!DateDiff("d", [Measures].[DateExample], "01/07/1987")
SELECT
    Descendants([Time].[Monthly].[Year].&[2008],,leaves) ON 0
FROM [Strategic Planning]
WHERE ([Account].[Profit and Loss].&[5010],
       [Measures].[DateDiff],
       [Entity].[Divisions].&[5003])

Although the above is just a simple example, it should give you an idea of the kind of calculations that can be performed in Analysis Services. It's possible to use these functions via a NativeMDXScript or a NativeMDXQuery. It's a similar story with SQL, as it also has its own DateAdd() function, as shown in the simple select statement below:

SELECT DateAdd(d, [Value], '30/12/1899')
FROM dbo.[MG_Strategic Planning_MeasureGroup_default_partition]
WHERE Scenario_memberid = 4 AND Account_MemberId = 5010

So it's a shame that PEL can't work with dates, but the fact that both the database engine and Analysis Services have a DateAdd function means that it's possible to use dates for logic in both definition and procedural business rules.
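If you want to sanity-check the serial-number arithmetic outside of Excel, the conversion both ways is a one-liner in most languages. Here's a minimal C# sketch (not PPS-specific code; the serial 39630 is just an illustrative value, and the class name is my own):

```csharp
using System;

class ExcelDateSerial
{
    // Excel (and hence the PPS fact table) stores a date as the number of
    // days since 30/12/1899, so converting back is a single AddDays call.
    static readonly DateTime Epoch = new DateTime(1899, 12, 30);

    public static DateTime FromSerial(double serial)
    {
        return Epoch.AddDays(serial);
    }

    public static double ToSerial(DateTime date)
    {
        return (date - Epoch).TotalDays;
    }

    static void Main()
    {
        // 39630 is the serial Excel displays for 01/07/2008
        Console.WriteLine(FromSerial(39630).ToString("dd/MM/yyyy")); // 01/07/2008
        Console.WriteLine(ToSerial(new DateTime(2008, 7, 1)));       // 39630
    }
}
```

This mirrors what the VBA and T-SQL DateAdd() calls above are doing with the '30/12/1899' literal.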

PerformancePoint Planning: Deleting a Custom Member Property - A Solution

I had a bit of a rant yesterday about the fact that I've had to compromise on naming member properties when I've inadvertently created them with the wrong data type.  As I mentioned, I found a method on the dimension attribute collection in the Planning client assemblies that hinted it might allow me to delete a member property, so I decided to give it a go. Below is some really rough and ready C# code that actually does delete a dimension member property.  I will improve the code and probably add it into my PPSCMD GUI interface as a 'feature pack' bonus at some stage. However, if you are in desperate need of code to delete a member property, and you can't wait for PPSCMD GUI v0.2 or PerformancePoint version 2 (I'm not sure which will come first!), the code is below (use at your own risk!).

Note:  Replace "MyApp", "MyDimension", "MyAttribute", oh, and the server address, accordingly.

using Microsoft.PerformancePoint.Planning.Client.Common;
using Microsoft.PerformancePoint.Planning.Bmo.Core;
..
// Set up the PPS application metadata manager
ServerHandler serverHandler = new ServerHandler("http://localhost:46787");
MetadataManager manager = new MetadataManager();
manager.ServerHandler = serverHandler;
manager.ServerHandler.Connect();

// Get the system metadata
BizSystem system = manager.GetSystem(true);

// Get hold of the PPS application
BizApplication ppsApp = system.Applications["MyApp"];

// Obtain the root model site from the application
BizModelSite site = ppsApp.RootModelSite;

// Obtain the dimension that contains the member property
BizDimension dimension = site.Dimensions["MyDimension"];

// Obtain the member property
BizDimensionAttribute attribute = dimension.Attributes["MyAttribute"];

// Check out the dimension
manager.CheckOut(dimension.Id, dimension.ParentModelSite.Id);

// Perform the delete
dimension.DeleteDimensionAttribute(attribute, null);

// Submit the change
manager.SubmitModelSite(ppsApp.Id, dimension.ParentModelSite,
    Microsoft.PerformancePoint.Planning.Bmo.Interfaces.SubmissionType.Update);

// Check in the dimension
manager.CheckIn(dimension.Id, dimension.ParentModelSite.Id);

Update:  I've since discovered that you can obtain an unsupported utility from Microsoft Support that reportedly does the same thing, doh!  Oh well, it's always nice to have the code. :)

PerformancePoint Planning: Deleting a Custom Member Property..

Update:  I've posted a solution to deleting a custom member property here.

I've done this countless times: I've created my perfectly named custom member property when it suddenly dawns on me that I've forgotten to give it the right data type.  No problem, right?  Wrong!  From within PBM, can you change the data type?  No!  Can you delete the member property?  No!  Can you rename the member property?  No!

So, what are the options?  Well, you could wait for version 2 (I truly hope you can edit/delete member properties in V2!), you could hack the back-end database in the vague hope of removing the member property safely, or, as I have been doing in the past, create a new member property with a less than perfect name and try not to clench teeth and fists every time I glance at the original.  Well, I've had enough, and decided I'm going to take action.

Strangely, the Microsoft.PerformancePoint.Planning.BMO assembly contains a method called DeleteDimensionAttribute on the dimension attribute collection.  I wonder...  Anyone tried?

New PerformancePoint Contoso Demo - Released

Amidst my write-up of the first day of the Microsoft BI Conference, I mentioned that a new planning demo suite was imminent and that I would post more information about the demos soon.  Well, as it has now been officially released (27th October), I can spill the beans...  Taken directly from the PPS Planning Forum announcement, the demo...

...consists of a Planning and Financial Consolidation demo. It shows how the fictitious Contoso Group uses Microsoft Office PerformancePoint Server for planning, statutory consolidation and data analysis.

Well, I'm proud to announce that Adatis, in the shape of my colleague Jeremy Kashel, designed and built the PerformancePoint Planning element of the suite.  The PerformancePoint Financial Consolidation element was conceived and developed by our friends at Solitwork of Denmark.  The demo can be downloaded from here... ...and is part of the next 'All Up BI VPC' (Version 7).  Great work guys!

NativeMDXQuery Business Rules in PerformancePoint Planning

Having posted about NativeSql business rules a while back, I thought that I might as well cover NativeMdxQuery business rules also, especially as there isn't too much documentation available on the web for this subject. NativeMdxQuery is a rule implementation type that can be used with assignment rules in PPS Planning. Rather than writing PEL and having the compiler convert your PEL into SQL or MDX, the idea is that you write the raw MDX directly if you pick NativeMdxQuery.

Why Native Rules?

When I posted about NativeSql, I mentioned a scenario or two where using NativeSql was very useful. The advantages to writing a NativeMdxQuery, however, are less obvious. This is especially the case when you consider that a) standard MDX rules in PPS Planning are less restrictive than the SQL rules, and b) PEL is very efficient - you can write a very concise MDX PEL rule versus writing the equivalent in MDX itself. So what advantages are there?

Although a fair number of MDX functions are included in PEL, it's not possible to use all MDX functions/statements/operators, e.g. Case(), Item(). This is one situation where you might want to write a NativeMdxQuery. Also, the ability to filter and restrict data in a raw MDX statement is far more powerful than the options available in PEL. For example, you can use the Filter()/IIF() functions in PEL, but they're quite slow and you're restricted as to where you can put them. If you use MDX, you can use both a WHERE clause to slice the query and/or a HAVING clause for filtering. Finally, when just writing queries from an SSAS cube, a technique that's sometimes used is to create query-scoped calculated members, by using the MDX statement WITH MEMBER. This allows you to have calculation steps in your MDX query, and can effectively be used in one sense as a temporary variable store for calculations. If you're trying to do complex calculations in PEL, you have to assign everything to the 'this' keyword.
It's much cleaner to have a block of code where you can define any calculations that the main query depends on. This is what WITH MEMBER will let you do.

How to Write a NativeMdxQuery

The requirement for a business rule using NativeMdxQuery is that you write an MDX select statement which specifies the target cells that will be written to. Unlike NativeSql statements, you do not need to handle how the data gets inserted - PerformancePoint handles all that for you, as long as you produce the MDX select statement. An example select statement that makes up the entire content of a NativeMdxQuery assignment rule is shown below. The comments show what the query is doing:

--[Measures].[Last Year] just makes the rule a bit more readable
WITH
MEMBER [Measures].[Last Year] AS
    ([Measures].[Value], [Time].[Monthly].CurrentMember.Lag(12))
CELL CALCULATION queryCalc
FOR
    --This specifies the cells that we are overwriting with an expression or value
    '([Measures].[Value],
      [Scenario].[All Members].[Scenario].&[1],
      {[Time].[Monthly].[Month].&[200901],[Time].[Monthly].[Month].&[200902]},
      {[Account].[Profit and Loss].[Level 07].&[5010],[Account].[Profit and Loss].[Level 07].&[5009]},
      [BusinessProcess].[Standard].[Level 06].&[8],
      [TimeDataView].[All Members].[TimeDataView].&[1],
      Descendants([Entity].[Divisions].[(All)].&[0], ,leaves),
      Descendants([Currency].[All Members].[(All)].&[0], ,leaves),
      Descendants([Product].[Product Category].[(All)].&[0], ,leaves))'
AS
    --The 100 is the value that we are giving the cells above
    100
--This is the select statement which determines the cells that will receive the value above
SELECT NON EMPTY
    ([Measures].[Value],
     ({[Scenario].[All Members].[Scenario].&[1]},
      {{[Time].[Monthly].[Month].&[200901],[Time].[Monthly].[Month].&[200902]}},
      {{[Account].[Profit and Loss].[Level 07].&[5010],[Account].[Profit and Loss].[Level 07].&[5009]}},
      {[BusinessProcess].[Standard].[Level 06].&[8]},
      {[TimeDataView].[All Members].[TimeDataView].&[1]},
{Descendants([Entity].[Divisions].[(All)].&[0], ,leaves)}, {Descendants([Currency].[All Members].[(All)].&[0], ,leaves)}, {Descendants([Product].[Product Category].[(All)].&[0], ,leaves)})) --Ensure we only write to cells with a certain value by using HAVING HAVING ([Measures].[Last Year] > 100000) properties [Scenario].[All Members].Key , [Time].[Monthly].Key , [Account].[Profit and Loss].Key , [BusinessProcess].[Standard].Key , [Entity].[Divisions].Key , [TimeDataView].[All Members].Key , [Currency].[All Members].Key , [Product].[Product Category].Key ON COLUMNS FROM [Strategic Planning] --Filter on a dimension member property by using a WHERE CLAUSE WHERE ([Entity].[Region].&[North]) The points to note about the above statement are: You must connect to the correct cube for the current model; You don't need to include a cell calculation - but it's the way that Microsoft implement business rules that use MDX, and it's hard to see how you would get a rule to be of any use without it; You must include the member keys as properties, otherwise the rule will error. Conclusion Writing MDX is clearly not for all users of PerformancePoint, but does provide the ultimate in flexibility when compared to PEL. Most MDX queries written in PPS will use cell calculations. If you're not used to how these work, or you just want to save some time, remember that an easy way to get started is to use the debug button on an MDX PEL rule in PBM - this will output a query that is similar to the one shown above.

Day 1 at the BI conference

So here we are in Seattle at the BI conference. Day 1 and it's been great to catch up with some old faces and meet some new ones. We were promised some big news today and, as is being reported by Chris, Mosha, Marco and all, we weren't disappointed. Both "Kilimanjaro" and "Gemini" look super-exciting for all of us in the MS BI world. The former is the evolution of the DATAllegro acquisition - we saw an SSRS report that ran in 10-15 seconds against (hold little finger to corner of mouth) one trillion relational records in a 150 terabyte database - impressive. Equally impressive was 20 million rows sorting and filtering in the blink of an eye! The Gemini project (self-service BI) had two key features for me: firstly, the in-memory storage (think TM1) that allows that sort of performance; and secondly - something that hasn't been widely commented on so far - the ability to publish your Excel reports to SharePoint/web (as XLCubed Web does now) at the click of a button. The interface looked really good already, and of course it's generating AS cubes behind the scenes. It did raise the question of how it's all going to fit in with PPS V2 - hopefully the Office and SQL teams have been talking! I think a lot of BI professionals were probably initially thinking (like me) that this could be bad news for their careers, but having taken it all in, I don't think that's the case. Although it's going to have data cleansing abilities, this isn't going to be a replacement for data warehouses/marts. In fact, for the whole self-service BI thing to really take off, it's going to need really good data underneath it. Microsoft's intention with the Gemini release is to "democratise" BI; the more people who get to use this type of concept, the better the data quality will have to be - you can't expect every department to have to clean their own data.
Allowing users to create and publish their own cubes and reports has a few warning signs as well - isn't this just Excel hell without Excel? Every department can create their own calculations, reports, cubes etc. We've been telling our clients for years that they really need one version of the truth - but now we're going to let each user make their own truth? It will certainly need some thought. As Chris W mentions, it's likely to still need technical resource to help users create complex calculations, so we won't be out of a job just yet ;) As for the rest of the day, some interesting sessions around PPS and SSRS were the order of the day for me. There was disappointment in one respect, though, as I found out that the PPS monitoring SDK definitely won't allow you to build a custom cascading filter - something that I was going to investigate. More later this week....

BI for IT Professionals using PerformancePoint

The PerformancePoint downloads page has recently been updated to include a framework for how BI can be provided to the IT department. Normally the IT department would be helping to support BI solutions, not actually being end users themselves - but it makes sense when you consider the kind of information that Operations Manager captures. As this video explains, the end goal is to create solutions that allow effective monitoring of the IT infrastructure. An example of the kind of dashboards that can be produced is shown below:

There is a white paper and also a sample solution available for download to learn more.

Tracking PerformancePoint Planning Submissions Using Reporting Services

The standard operational reports that come with PerformancePoint allow you to report on a variety of PerformancePoint admin-related activities, such as cycles, assignments, forms, jobs and associations. I find that the assignments report is particularly useful - after all, finding out who has/hasn't submitted is an important part of any data-gathering exercise. Whilst it is useful, I do find that the assignments report is the one that admin users want changed, especially when a model site exists containing many cycles and assignments.

Extra Functionality

With a large PPS Planning implementation you can easily end up with many assignment instances, cycles and users. I've been involved in such an implementation recently and, due to the large number of assignments, the admin user requested a bit more filtering capability than the out-of-the-box assignments report provides. Also, the existing assignments report will tell you that user A has submitted their assignment, but it won't go into any detail about what the submission actually contained. E.g. did the user submit all their entities? For some users it is key to know what other users have been submitting - for one thing, it makes navigation easier if, as an approver, you know exactly which department/entity to pick in the assignment filters.

Examples

By knowing which tables to use, you can write an SSRS report that provides the additional functionality mentioned above. The starting point is to get the base report query right.
In my case, as I'm in a separate auditing database, the query goes inside a custom stored procedure, and is as follows:

SELECT
    A.AssignmentId,
    C.CycleInstanceName,
    AD.AssignmentDefName,
    A.AssignmentName,
    U.UserId,
    U.UserName,
    ENT.Name AS EntityName,
    CUST.Name AS CustomerName,
    CASE WHEN A.Status = 'partial' OR A.Status = 'Approved' OR A.Status = 'Submitted'
         THEN 1 ELSE 0 END AS Draft_Submitted,
    CASE WHEN A.Status = 'Approved' OR A.Status = 'Submitted'
         THEN 1 ELSE 0 END AS Final_Submitted,
    CASE WHEN A.Status = 'Approved' THEN 1 ELSE 0 END AS Approved,
    Approve.UserName AS Approver
FROM dbo.Assignments A
LEFT OUTER JOIN dbo.[MG_Planning_MeasureGroup_default_partition] Fact
    ON A.AssignmentID = Fact.AssignmentID
LEFT OUTER JOIN dbo.AssignmentDefinitions AD ON AD.AssignmentDefID = A.AssignmentDefID
LEFT OUTER JOIN dbo.CycleInstances C ON C.CycleInstanceID = A.CycleInstanceID
LEFT OUTER JOIN dbo.D_Entity ENT ON ENT.MemberId = Fact.Entity_MemberId
LEFT OUTER JOIN dbo.D_Customer CUST ON CUST.MemberId = Fact.Customer_MemberId
LEFT OUTER JOIN dbo.BizUsers U ON U.UserID = A.ContributorUserId
LEFT OUTER JOIN dbo.ApproverList AL ON AL.AssignmentID = A.AssignmentId
LEFT OUTER JOIN dbo.BizUsers Approve ON Approve.UserID = AL.ApproverUserID

You can figure out most of the tables to use by looking at a view called AssignmentsView within the application database. One thing that I have taken into account is assignment definitions. If you have a large number of users completing an assignment, then the chances are that you will have set up an assignment definition that points at a business role or a submission hierarchy. You ideally want to be able to filter on the assignment definition to return all assignment instances that belong to that assignment definition.
Therefore, in my case I have three filters for the report, but you could easily add more:

The final view is a report that shows the status of the assignments returned by the filter but also, when expanded, shows the entities and customers that the contributor has submitted:

The above is just a taster of what can be achieved. A couple of ways that it can be extended include:

- Integrating with Sacha's data auditing idea to provide detailed history on what values the contributor has changed;
- Including comments, annotations and deadlines.
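As an aside, the status rollup encoded in the query's CASE expressions is worth spelling out, since Draft_Submitted deliberately includes partial saves while Final_Submitted does not. A minimal sketch in Python (the status strings come from the query above; the function name and dictionary shape are purely illustrative):

```python
# Mirrors the three CASE expressions in the report query: each PPS assignment
# status maps to cumulative Draft/Final/Approved flags.
def submission_flags(status: str) -> dict:
    draft = status in ('partial', 'Approved', 'Submitted')   # any data saved at all
    final = status in ('Approved', 'Submitted')              # contributor has submitted
    approved = status == 'Approved'                          # approver has signed off
    return {'Draft_Submitted': int(draft),
            'Final_Submitted': int(final),
            'Approved': int(approved)}

print(submission_flags('partial'))    # draft only
print(submission_flags('Submitted'))  # draft and final, awaiting approval
```

Note the flags are cumulative: an 'Approved' assignment sets all three, which is what lets the report show counts at each stage of the submission workflow.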

PerformancePoint at the MS BI Conference

As Sacha has mentioned, he, Jeremy and I will all be attending the BI conference in less than two weeks' time in Seattle. Last year was a great chance to meet some of the BI community and attend some really good BI sessions, but it was (unsurprisingly) a little light on PPS content. This year we're overloaded with sessions and it's going to be a tough choice of which to attend, with at least one and often two or three potential sessions in every time slot. A couple that have caught my eye:

- Advanced Dashboard & Scorecard Design Techniques - just one of a number of sessions that Alyson Powell Erwin (Monitoring Product Manager) is presenting - those of you who read the Monitoring forums frequently will know that Alyson has the answer to most of the questions raised.
- Office PerformancePoint Server 2007 Planning - Implementation Tips And Tricks - a chance to hear how other PPS teams have gone about projects.

I've no doubt Jeremy and Sacha will be going to see Michael Bower and Scott Sebelsky present Financial Consolidation with Office PerformancePoint Server 2007, as well as numerous other Planning-specific sessions. This is just a taster - take a look at the session list and plan your diary. On top of this there are some great SQL BI sessions, as well as a chance to get a look at MDM - hopefully they will be filming lots of the sessions as they did last year. We're also hoping to meet in person as many as possible of what's becoming a really strong PPS community - do make sure you come up and say hello if you see us around.

Dynamic Range, Time Property Filter = Empty Matrix - A bug?

I think I've found a bug in the way the Excel Add-In generates MDX under certain 'rolling' conditions. The requirement I have is to be able to forecast at the day level for a rolling 6 months: starting from the current period (which is to be updated each week) and running for a period of 180 days (~6 months). To avoid requiring 180 columns, a dimension-property-based filter must be available to select the month in which to forecast. This provides a more concise data entry form detailing up to 31 days of the selected month in which to add forecast values. My form is dimensioned up as follows:

Dimension      Position
Employee       Filter
Time (Month)   Filter (Dimension Property)
Scenario       Filter
Location       Rows
Time (Day)     Columns

I set up the columns as a dynamic range to ensure that the forecast 'rolls' with changes in current period. The range was set from current member id + 0 : current member id + 180. (Current Period is set to 16th September 2008 - today.) The simplified MDX that this produces is below:

select
  { Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0) :
    Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180) } *
  { [Measures].[Value] } on columns,
  { descendants([Location].[Standard].[All].[All Locations],,after) } on rows
from
  ( select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
where
  {[Employee].[Employee].[All].[John Doe]} * {[Scenario].[All Members].[All].[Forecast]}

The first element to notice is that the columns have been set to a range using Ancestor at the member id level and Lag to cover the 180 days:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0) :
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)

The next point to highlight is the sub-query that represents the selected time dimension property value (September 2008):

( select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])

When you run this in SSMS, the following data set is returned: the Locations appear on the rows, the days appear on the columns - exactly as required. Changing the sub-query filter to October 2008 - the next month in the range, and definitely covered by the -180 day lag (not sure why the Lead function isn't used here?) - results in a problem: the results returned are now missing the day-level columns. The root of this problem is the column expression - if you replace it with a direct Lag on the current period member, the expected results are returned:

select
  { [Time].[Base View].[MemberId].&[20080916].Lag(0) :
    [Time].[Base View].[MemberId].&[20080916].Lag(-180) } *
  { [Measures].[Value] } on columns,
  { descendants([Location].[Standard].[All].[All Locations],,after) } on rows
from
  ( select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
where
  {[Employee].[Employee].[All].[John Doe]} * {[Scenario].[All Members].[All].[Forecast]}

Now, the only workaround I can come up with is to build the form using a custom MDX formula, so I reckon this warrants raising a bug on Connect - which I've logged here:
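For anyone puzzled by the Lag(-180) in the generated MDX: Lag with a negative argument simply steps forwards along the level (i.e. it behaves like Lead). A quick illustrative model of that navigation over an ordered day level - plain Python standing in for MDX, with made-up member keys:

```python
from datetime import date, timedelta

# Ordered day-level member keys (yyyymmdd), enough to cover the 180-day range.
days = [(date(2008, 9, 1) + timedelta(i)).strftime("%Y%m%d") for i in range(365)]

def lag(members, member, n):
    """Model of member.Lag(n): step n members backwards along the level.
    A negative n therefore steps forwards, which is what Lead(n) does."""
    j = members.index(member) - n
    return members[j] if 0 <= j < len(members) else None

assert lag(days, "20080916", 0) == "20080916"   # Lag(0) is the member itself
print(lag(days, "20080916", -180))              # 180 days forward from current period
```

So the range Lag(0) : Lag(-180) covers the current period plus the 180 following days, which is exactly the rolling window the form needs - the bug lies in wrapping the endpoints in Ancestor, not in the Lag arithmetic itself.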

Analysis Services Properties for PerformancePoint MDX Rules

One of the great features of PEL is that you can choose either a SQL or an MDX implementation for your rules, depending on what you want to achieve. Whilst the MDX rules are much less restrictive than the SQL rules, they can sometimes run slower, depending of course on how your environment is set up. When the MDX rules do take a long time to run, it's possible that you might see this message:

What has essentially happened within Analysis Services is that an object (e.g. a cube or dimension) has been processed and is waiting to commit. Unfortunately this is not possible whilst an existing query is running, so AS waits for the query to complete. It will, however, only wait so long, as defined by one of the Analysis Services properties, ForceCommitTimeout. Once this threshold has been reached, the offending query is cancelled, resulting in the error message above. Finding the right balance between the Analysis Services ForceCommitTimeout and the PerformancePoint PAC 'OLAP Cube Refresh Interval' setting is key. If you have set PPS to re-process its cubes too often then you may well see the above message. On the other hand, if you set the ForceCommitTimeout too high, then queries executed whilst the cube is waiting to commit will be made to wait, meaning your big query will get through OK, but other users may see bad performance. Darren Gosbell has written an excellent post here that provides a thorough explanation of ForceCommitTimeout and other related properties.
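To make the trade-off concrete, here's a tiny back-of-the-envelope model of the cancellation behaviour (plain Python, purely illustrative - the function and figures are mine, not server APIs; only the 30-second default for ForceCommitTimeout comes from Analysis Services):

```python
# Once a processed object is waiting to commit, Analysis Services waits up to
# ForceCommitTimeout milliseconds for in-flight queries to finish, then cancels them.
def query_outcome(query_remaining_ms: int, force_commit_timeout_ms: int = 30_000) -> str:
    if query_remaining_ms <= force_commit_timeout_ms:
        return "completes"   # commit waits; the query finishes first
    return "cancelled"       # threshold reached; the query gets the error above

# A 25s MDX rule survives the default timeout; a 45s one does not:
print(query_outcome(25_000))
print(query_outcome(45_000))
```

The second case is the PPS scenario in this post: a long-running MDX rule overlapping a cube refresh gets cancelled, so either the rule must be faster, the refresh less frequent, or the timeout higher - at the cost of making other users' queries queue behind the pending commit.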