Adatis BI Blogs

Microsoft BI Conference - Day 1 - 6th October 2008

So, although later than the trail blazers, I thought I'd write up a brief note about day one of the Microsoft BI Conference. The 'Kilimanjaro' announcements have been done to death, although I've noticed a couple of crossed wires. Kilimanjaro is NOT the next version of SQL Server - it sounds more like an interim release; whether that comes as part of a service pack or a new type of feature pack delivery method I guess we'll have to wait and see. However it arrives, we have to wait until the first half of calendar year 2010.

With regard to 'Gemini', I'm hoping they make the in-memory (column-based storage?) engine part of the SQL Server engine proper, as this can then benefit PPS and any SQL Server dependent app, not just the 'Gemini' framework. Imagine PPS-P data entry/reporting running in memory! It's certainly a game-changer and it will be interesting to see where and how it's positioned. I can't help thinking that it's verging on promoting data silos and 'multiple versions of the truth' and it wouldn't surprise me if it's misused by some customers: "We don't need a data warehouse, we'll just use Gemini"... although Tim did quiz the team on this. Having said all that, it's pretty damn exciting and will change the face of BI for both customers and BI implementers.

The first breakout session I attended was a Chalk and Talk by the engaging Peter Bull on moving from the spreadsheet world to PerformancePoint Planning. He outlined a suggested method for converting existing Excel-based 'business applications' into PerformancePoint models; he was open and honest about some of the shortcomings of the product but also brought our attention to the built-in features that aid design and productivity. The following tips were core to the session:

- Don't replicate the current Excel 'models'.
- Use filters to reduce the scope and size of input matrices.
- Limit definition rules (don't build cell-by-cell calculations).
- Don't use flat hierarchies.
- Don't assume all calculations need to be real time.
- Performance test by cutting and pasting the MDX.

Another Chalk and Talk followed, held by Michael Bower and Scott Sebelsky, on using PPS-P for Financial Consolidation. They discussed the consolidation functionality available in PPS-P and, using a two model site application, walked us through the implementation using US GAAP (Corporate Model Site) and IFRS (EMEA Model Site). The demo, supporting white-paper and a new planning demo will be available shortly and were shown off in the hands-on labs at the conference. I'll shortly be able to post more information on these new demos...

My third session of the day effectively covered some elements, mainly Report Builder 2.0, of the SQL 2008 feature pack that is out later this month. One of the features demonstrated was component-based report building from a self-service perspective, and it did look quite slick. The session was presented by the SSRS PM team and they had a clever way of obtaining feedback from the audience on which features they would like to see the most. They handed out a crib sheet of features and asked us to allocate a $100 budget across the features - they collected the sheets and will use this as a basis for deciding which features to focus on. In addition to component-based self-service reporting, features such as Office integration using the acquired SoftArtisans technology, Notify Me, Interactive Reports and Rich Presentation were shown off to good effect.
Steve Hoberecht and Srini Nallapareddy were next on my list, taking us through advanced PPS Planning rules and calculations. There was some good stuff - I always thought the fact that the ALLOCATE statement appends data was a bug, but now I know why it does what it does, and it warrants a separate post. Some other tips, particularly for definition rules, some new, some old, were also presented:

- Reduce scope.
- Avoid hard-coded member labels to avoid security restriction errors.
- Consider automatic rules/scheduled assignments.
- Rule order is important / avoid infinite loops.
- Consider moving calcs to Excel.
- Consider input/reporting models.
- Locate badly performing rules by commenting out the rules in BIDS and re-introducing them one by one (from within BIDS), rather than setting the active flag from within PBM, as that is more tedious.

The day was rounded off by a great steak with the other UK BI partners at Ruth's Chris Steak House.

PerformancePoint at the MS BI Conference

As Sacha has mentioned, he, Jeremy and I will all be attending the BI conference in less than two weeks' time in Seattle. Last year was a great chance to meet some of the BI community and attend some really good BI sessions, but it was (unsurprisingly) a little light on PPS content. This year we're overloaded with sessions and it's going to be a tough choice of which to attend, with at least one and often two or three potential sessions in every time slot. A couple that have caught my eye:

Advanced Dashboard & Scorecard Design Techniques - just one of a number of sessions that Alyson Powell Erwin (Monitoring Product Manager) is presenting - those of you who read the Monitoring forums frequently will know that Alyson has the answer to most of the questions raised.

Office PerformancePoint Server 2007 Planning - Implementation Tips And Tricks - a chance to hear how other PPS teams have gone about projects.

I've no doubt Jeremy and Sacha will be going to see Michael Bower and Scott Sebelsky present Financial Consolidation with Office PerformancePoint Server 2007, as well as numerous other Planning-specific sessions. This is just a taster - take a look at the session list and plan your diary. On top of this there are some great SQL BI sessions, as well as a chance to get a look at MDM - hopefully they will be filming lots of the sessions as they did last year. We're also hoping to meet (in person) as many as possible of what's becoming a really strong PPS community, so do make sure you come up and say hello if you see us around.

Dynamic Range, Time Property Filter = Empty Matrix - A bug?

I think I've found a bug in the way the Excel Add-In generates MDX under certain 'rolling' conditions. The requirement I have is to be able to forecast at the day level for a rolling 6 months; starting from the current period (which is to be updated each week) and running for a period of 180 days (~6 months). To avoid requiring 180 columns, a dimension property based filter must be available to select the month in which to forecast. This provides a more concise data entry form detailing up to 31 days of the selected month in which to add forecast values.

My form is dimensioned up as follows:

Dimension - Position
Employee - Filter
Time (Month) - Filter (Dimension Property)
Scenario - Filter
Location - Rows
Time (Day) - Columns

I set up the columns as a dynamic range to ensure that the forecast 'rolls' with changes in current period. The range was set from current member id + 0 : current member id + 180. (Current Period is set to 16th September 2008 - today.) The simplified MDX that this produces is below:

select
  { Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0) :
    Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180) }
  * { [Measures].[Value] } on columns,
  { descendants([Location].[Standard].[All].[All Locations],,after) } on rows
from
  ( select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
where
  {[Employee].[Employee].[All].[John Doe]} * {[Scenario].[All Members].[All].[Forecast]}

The first element to notice is that the columns have been set to a range using Ancestor at the member id level and Lag to cover the 180 days:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0) :
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)

The next point to highlight is the sub-query that represents the selected time dimension property value (September 2008):

select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]

When you run this in SSMS, the locations appear on the rows and the days appear on the columns - exactly as required. Changing the sub-query filter to October 2008 - the next month in the range, and definitely covered by the -180 day lag (not sure why the Lead function isn't used here?) - results in a problem: the results returned are now missing the day level columns. The root of this problem is the column expression - if you replace the column expression with a direct Lag on the current period member, the expected results are returned:

select
  { [Time].[Base View].[MemberId].&[20080916].Lag(0) :
    [Time].[Base View].[MemberId].&[20080916].Lag(-180) }
  * { [Measures].[Value] } on columns,
  { descendants([Location].[Standard].[All].[All Locations],,after) } on rows
from
  ( select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
where
  {[Employee].[Employee].[All].[John Doe]} * {[Scenario].[All Members].[All].[Forecast]}

Now, the only workaround I can come up with is to build the form using a custom MDX formula, so I reckon this warrants raising a bug on Connect - which I've logged here: https://connect.microsoft.com/feedback/ViewFeedback.aspx?FeedbackID=368206&SiteID=181

The Second Annual Microsoft BI Conference, 6th-8th October 2008 - Seattle

I was unable to make the first Microsoft BI Conference in Seattle last year but Tim made it, and reported that it was a great excuse for an extended weekend of partying and snowboarding in Whistler... sorry, a great way to meet like-minded people, learn the latest buzz and feel part of the BI community. Well, this year I can make it and I will be joining Tim and Jeremy for the duration. Bad luck for me though, it's too early for snow! ;o)

I've checked out the Tracks and Sessions and there are quite a few I'm up for. They haven't published the schedules yet, so whether I can jump tracks and still fit everything in remains to be seen, but here are my picks:

Business Value
- Emerging Trends and Technologies in the Business Intelligence Market
- The Path to Pervasive Performance Management
- The ROI of Performance Management

Clients and Applications Track
- Advanced Office PerformancePoint Server 2007 Planning Business Rules and Calculations
- Integrating Microsoft Dynamics Data for Planning, Analysis and Reporting in Office PerformancePoint Server 2007
- The 4th Annual BI Power Hour

Customer Track
- Master Data Management (MDM) Customer Roundtable
- Successfully Implementing the Complete Microsoft BI Stack
- The Evolution of “Difference Makers” at Energizer Holdings, Inc. Using Microsoft Business Intelligence

Deployment & Best Practices Track
- Designing High Performance Cubes in SQL Server 2008 Analysis Services
- Mastering Master Data Management
- Office PerformancePoint Server 2007 Planning - Implementation Tips And Tricks

Partner Track
- BI Unplugged - a Panel Discussion

Platforms and Infrastructure Track
- Optimizing Query Performance in Microsoft SQL Server 2008 Analysis Services
- SQL Server 2008 & Beyond
- SQL Server 2008 Integration Services New Features and Enhancements

Phew! Well, after that lot I'm definitely gonna need a drink, so if you fancy a bevvy, hunt me down - it'd be great to hook up!

PerformancePoint Planning - Loading the account dimension members from CSV

This post is more for my own benefit; I can never remember the format of the CSV file used to load the Account dimension. Below is an example CSV file in its simplest form that should help get me started faster the next time I need it!

Label,Name,Description,AccountTypeMemberId
1000,Salaries Permanent Staff,,Expense
2000,Temporary Staff Workload,,Expense
4000,National Insurance Costs,,Expense
5000,Pension Costs,,Expense
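If the account list already exists in a source system, a query along the following lines can produce rows in the same Label,Name,Description,AccountTypeMemberId shape ready to be saved as CSV. This is just a minimal sketch - the source table and column names (dbo.SourceGLAccount, AccountCode, AccountName) are illustrative, and in reality you would map each source account to the correct PPS account type rather than hard-coding 'Expense'.

    -- Minimal sketch: shape source accounts into the CSV layout expected by the
    -- Account dimension load. Table/column names here are assumptions.
    SELECT
        CAST(AccountCode AS varchar(20)) AS Label,
        AccountName                      AS Name,
        ''                               AS Description,
        'Expense'                        AS AccountTypeMemberId  -- map real account types here
    FROM dbo.SourceGLAccount
    ORDER BY Label;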

An unknown server error occurred when submitting User

We recently had to turn a couple of demo boxes into a two-client development environment for an overly aggressive PPS-P deadline. Both demo boxes were connected to the network and both were PPS-P servers. We wanted to nominate a specific box as the PPS-P server for the build. Typically, the one we nominated, for some reason, would not allow us to add domain users to the Users tab of the PAC, whereas the other one would. We received the following (not particularly helpful) error message when attempting to add a domain level user through PAC:

The following items cannot be saved
The total number of errors during submission is 1.
An unknown server error occurred when submitting User - <domain>/<username>
The server log may contain more information about this error.

In our environment both boxes were using LOCAL (but different) accounts as the Application Pool identity and the Planning Service Account identity. When we changed the local identity accounts over to DOMAIN level accounts we could add users on both boxes. It appears that the PPS-P web service needs access to the domain controller to check whether the user entered actually exists. In fact it does a little more than that, as it retrieves the user's SID to store in the Planning database too. Note: in addition to changing the user to a domain account, we had to ensure that the account existed in the local IIS_WPG (IIS Worker Process Group) as well.

But why, when using local accounts, would one server allow the addition of domain users and the other not? The answer was pass-through authentication! The server that worked under the local account was using an account and password combination that matched an account and password combination on the domain!

Nick Barclay posted about connection problems with PPS M&A a good while back and gave a great write-up of how to overcome the issues encountered. That's worth a read if you are not familiar with IIS application pools and identity accounts, as the issues we encountered were related.

PerformancePoint Suggestion box

Inspired by Adrian's post on the TechNet Planning Forum, here's my list of 'I wants' for PPS-P. I've cross-posted this to the forum discussion too. A few of these have been suggested before but there are some new ones in there too - in no particular order:

Application Migration Wizard
There are several methods for doing this currently but nothing slick. I was hopeful that the new Data Migration Tool was the answer to this but alas, no. It only migrates data (don't get me wrong, this is useful) - what we really need is an Application Migration Tool that migrates everything, supporting both initial 'go-live' migrations and 'update release' migrations. It should support making a copy of the application on the same server too (with a different name, of course!).

Developer Edition
The PerformancePoint Server components can only be installed on Windows Server. This is a tad annoying for development and demo purposes. Currently we have a Windows Server built laptop for demos and sales activities. For development, when connecting to the server is not always feasible, we have a couple of pre-built PPS-P 'development' hard disk drives that we use to swap out with our non-server development boxes, manually migrating to the official servers periodically. It would be great if, like SQL Server, there was a 'Developer Edition' that could be installed on XP and/or Vista.

Improved Support for Business Rule Re-Use
There are several ways that business rule re-use could be implemented, but initially I'd like to see a 'formula' tab where, for example, you can define global formulas and expressions that can be selected (re-used) within various scope statements. Taking this a step further, how cool would it be if the global formulas were actually functions that accepted parameters, so you could produce a smaller set of generic formulas that can be re-used throughout the application.

Improved Rule Editing
The editor is a little clunky at present. We need something more akin to the SQL/MDX query editor in Management Studio.

Web-Based Data Entry
Whether via Excel Services or some other 'pivot table' style entry form. A couple of customers are banking on Microsoft providing this in the future. We at Adatis would love to know if they've actually started something like this in Redmond, as we'd be up for developing something in-house if not!

Extensible PEL
Much like custom tasks can be developed in SSIS and stored procedures can be created for Analysis Services, it would add some real power if PEL keywords could be added via custom .NET assemblies that are registered with the PPS-P server. Not only could complex, vertical-specific logic be encapsulated, intellectual property could be protected, opening up a third-party plug-in market.

Process Cycle Independent Data Entry Forms
It would really aid productivity if data entry forms could be created without the need to create a process management cycle and associated assignment.

Application/Model Templates
Imagine a 'Retail Budgeting Template' or a 'Transportation Forecasting Template' and how that could potentially wipe weeks off an implementation. This is a tall order due to the global nature of the product and the diversity of requirements within industry verticals, so it might be better to make lower level, component-type templates that can be plugged in to a larger solution, e.g. UK Tax and NI and Salary assumption models.

Protect Sensitive Data
Some data that is forecasted and budgeted is extremely sensitive, salary information for example. It should be possible to mark assignments as sensitive so that they challenge the user opening them for some form of authentication. Integrated security is not secure!

Spread Across Time
Not having the ability to spread across time, or to enter data at non-leaf level (as you can with the other dimensions), results in more models than you would ideally choose, increasing complexity.

Custom Spread Methods
Even and Ratio spread don't satisfy every client spreading requirement. If custom spread logic could be developed on the server and exposed through the Excel add-in for specific assignments, every requirement could be fulfilled.

Unlimited Undo/Versioning
Without the ability to copy applications, proper versioning and unlimited undo/redo (across sessions) would help. Adopting the Visual SourceSafe approach of versioning, pinning, branching and merging would greatly assist with development/deployment/testing cycles.

Validation
Centralised validation rules that can prevent invalid data from being submitted to the server would help reduce approval/review workload.

Dimension Member Creation
Excel add-in support for dimension member creation would assist greatly in certain budgeting/forecasting scenarios. Workflow and data collisions would need to be considered, otherwise data integrity could be jeopardised.

Multiple Model Data Entry
Allowing an assignment to capture data for multiple models would simplify the implemented process and provide greater support for, amongst others, assumption model based what-ifs.

Submission Triggered Jobs
It's common to want to run a job once all data has been submitted. Currently the job has to be run manually or scheduled. Allowing jobs to be triggered by data submissions would provide greater automated support for implemented processes.

PEL Business Rule Re-use and Pre-defined What If Scenarios

You've built a rule set to satisfy an original set of business requirements for calculating budget, and all is good with the world. Then a change order is raised and approved that requires, for want of a better description, a set of pre-defined what-if* scenarios to apply to the calculated budget. The pre-defined what-ifs are essentially the same business rule calculations with a slight factor or variance applied, so it makes sense to want to re-use the existing rules. Not an unreasonable request. What's the best way to implement this change request without introducing the burden of duplicated rules, and keeping your model clean and well structured?

Background

These rules utilise an assumption model that contains values such as salary levels, training days, tax assumptions etc. and calculate various budget lines based on headcount information collected from a variety of regional cost centres. The rules are pretty straightforward, as the Salary rule outlined below shows:

scope([Scenario].[All Members].[Budget],
      [Account].[Detail].[Salary Costs],
      Descendants([Role].[Role].[All Roles], 0, after),
      Descendants([Time].[Quarterly].[Year2008], 0, after),
      Descendants([Region].[Region].[Global], 0, after));
  this = ((Model([Drivers]),
           [Account].[Detail].[Annual Salary],
           [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
           [Role].[Role].CurrentMember,
           [Region].[Region].CurrentMember,
           [Scenario].[All Members].[Budget]) / 4)
         *
         ([Account].[Detail].[Headcount],
          [Time].[Quarterly].CurrentMember,
          [Role].[Role].CurrentMember,
          [Region].[Region].CurrentMember,
          [Scenario].[All Members].CurrentMember);
end scope;

Essentially, for each quarter, for each role within each region, we multiply the appropriate annual salary from the assumption model (divided by 4 for quarters) by the appropriate headcount to give us the budgeted salary figure.

New Requirement

Our change order requires a view of the budget as if we were to increase or decrease the headcount for each role within each region by 30%. Two members have been added to the scenario dimension to cope with this new requirement - you could argue that these members could exist in a separate 'what-if' style dimension that could then be applied to any of the scenarios independently. I actually prefer that approach, but for simplicity let's just limit this explanation to the scenario dimension. Okay, we have our new scenario members; we now need to wire up some rules to calculate these alternate budgets.

Design

Our first problem is how we can make the existing rules generic across budget scenarios. We have the budget scenario factors (+30% and -30%) to work in somewhere. We could hard-code these values into the rules, but that would require a rule set per budget scenario, thus duplicating all rules. Nasty. We could store the factors in an assumption model against the appropriate budget scenario, which could then be referenced within the rules. That would potentially allow us to re-use the business rules. However, I'm not keen on this approach at all, as I feel the budget factors are fixed, scalar values; storing them in an assumption model is overkill and would potentially require much more implementation effort.

So, what options are we left with? Not many. In this situation I would be tempted to create a member property on the Scenario dimension that holds the factor against the appropriate member. The business rules could then simply (yeah right, see later!) reference the member property and apply the factor. So, in the case of the main Budget scenario we would have a factor of 1, for Budget +30% Headcount we would store a factor of 1.3, and 0.7 would be set against Budget -30% Headcount.

On the face of it you can then update each rule to multiply every reference to Headcount by the appropriate factor held in the member property. However, there is a problem, and that problem lies with the PEL function 'Properties'. Unlike the MDX equivalent, the PEL version always returns a string, irrespective of the data type you assign to the member property itself. You cannot multiply by a string value, and PEL does not contain any data type conversion functions to overcome this issue.

Implementation

The workaround: use a native MDX implementation. That sounds worse than it is; I'm not suggesting that you re-write all the business rules in MDX, although that is certainly an option. Alternatively, you can add a single MDX business rule to the top of the rule set that re-calculates Headcount using the value of the member property:

scope(
  {[Scenario].[All Members].[Scenario].&[402],[Scenario].[All Members].[Scenario].&[501]},
  Descendants([Time].[Quarterly].[Year].&[2008], 0, AFTER),
  [Account].[Detail].[Level 02].&[5001],
  Descendants([Region].[Region].[Level 02].&[5201], 0, AFTER),
  Descendants([Role].[Role].[Level 02].&[101], 0, AFTER));
  ([Measures].[Value]) =
    ([Scenario].[All Members].[Budget],
     [Time].[Quarterly].CurrentMember,
     [Account].[Detail].[Headcount],
     [Region].[Region].CurrentMember,
     [Role].[Role].CurrentMember)
    * [Scenario].[All Members].CurrentMember.Properties('OffsetPercent');
end scope;

Apart from the fact that the MDX rule is very PEL-like, notice the last line of the rule. Here the headcount for each role, region and quarter is multiplied by the Scenario member property OffsetPercent. With this in place, the subsequent rules will calculate the budget scenarios based on the appropriate revised headcount value (due to the natural solve order of the rules in the rule set). The final step is to update the scope of each rule to include each of the budget scenarios, leaving the main bulk of the rule in place, untouched. Here's the updated version of the Salary rule we looked at earlier (the modification being the expanded scenario list in the scope):

scope({[Scenario].[All Members].[Budget],
       [Scenario].[All Members].[BudgetPlus30PcntHCount],
       [Scenario].[All Members].[BudgetMinus30PcntHCount]},
      [Account].[Detail].[Salary Costs],
      Descendants([Role].[Role].[All Roles], 0, after),
      Descendants([Time].[Quarterly].[Year2008], 0, after),
      Descendants([Region].[Region].[Global], 0, after));
  this = ((Model([Drivers]),
           [Account].[Detail].[Annual Salary],
           [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
           [Role].[Role].CurrentMember,
           [Region].[Region].CurrentMember) / 4)
         *
         ([Account].[Detail].[Headcount],
          [Time].[Quarterly].CurrentMember,
          [Role].[Role].CurrentMember,
          [Region].[Region].CurrentMember);
end scope;

For completeness, the modified rule set now contains the new (MDX) rule 'Headcount', which will be calculated first for each of the budget scenarios, ensuring the subsequent rules use the appropriately factored headcount figure. The support for business rule re-use is limited in PPS-P v1, but as this example hopefully illustrates, with a little thought and application, re-use is occasionally achievable. I hope there will be better support for re-use in future versions!
--------------------------------------- * Generally, what-if’s are used dynamically; you change a few numbers in a data entry worksheet, re-calculate the sheet and view the results, repeating the process as necessary. This is a great feature and really quite powerful, however, there are some situations where pre-defined what-if’s are required to easily and frequently show standard what-if scenarios.

New Microsoft Insider PPS Planning Blog

A great new PPS-P focused blog has been up and running for a good few weeks now. Peter Eb., a developer on the PPS Excel add-in, has started blogging some real gold with a mixture of definitions, 'how-tos' and workarounds. He's one of the resident experts on the Planning TechNet Forum and really knows his stuff! Just as a taster, here are the headings of his last few posts:

- What does this PPS Planning workflow action mean?
- HOWTO: Select a time range that is dynamic (updates over time for a rolling forecast)
- HOWTO: Hide filters for dimensions that confuse users
- What does this status "xyz" mean for my assignments?
- What does the error "filter selection for dimension XYZ is not in the filter scope" mean?

It's what technical blogging should be all about - brilliant!

Book Review: Planning with Microsoft PerformancePoint Server 2007

I've had this book for a while and have been meaning to publish a review for some time. I reviewed the companion book, Monitoring and Analyzing with Microsoft PerformancePoint Server 2007, by the same authors, Adrian Downes and Nick Barclay of B(iQ), who have already established themselves as the authority on Microsoft PerformancePoint Server 2007; just like the companion book, this one too reinforces that position. As I have come to expect from these guys, the book is crammed to the 250 page Rational Guide limit with essential information. It achieves several objectives: it provides an overview of the theory of Performance Management, it introduces the reader to the Microsoft PerformancePoint Planning product, and finally it holds your hand through the relatively steep learning curve of implementing a planning solution. Those who have been involved with PPS-P since the early CTP days, or later, will, I'm sure, have had several frustrating sessions trying to understand how and what is happening. I truly believe that this book would have removed much, if not all, of that frustration.

But, in the interests of providing a fair and balanced review, I do have one gripe. It would not surprise me if Adrian and Nick had a couple of sleepless nights and some 'discussions' with the Rational editors over which chapters to print and which chapters to provide as bonus material online (yes, just like the companion book there is a whole bunch of bonus material including additional chapters, sample downloads and source code). The Process Management chapter is one of the online bonus chapters that I definitely feel should have been printed. Process Management falls into the 'essential' category and you can't actually do any performance management without process management. Of the printed chapters, the one that could arguably be better suited to online delivery is the chapter concerning data integration, particularly when this chapter takes up approximately 16 valuable printed pages with plain T-SQL script. So be sure to read the essential online bonus material.

I think the authors would agree that the book does not set out to cover the whole of the product - the product is far too big to cram into 250 pages - but it is definitely attempting to be a 'get up and running quickly with the essentials' level book, and to that end, my hair-splitting gripe aside, it absolutely nails it. Easy to read, perfectly structured and comprehensive enough to give you the confidence to chase that 'early adopter' opportunity. It was always going to be a challenge to write a Rational Guide on PerformancePoint Planning but the boys have come good once again. I've added this book, along with the companion book on M&A, to our internal 'essential reading list' - top job chaps!

Essential Adatis Blogs

I've been speaking to a couple of people recently who, although they subscribe to my blog, are not aware of our PPS Planning SharePoint Web Part that Tim has blogged about here and here. This post is more to raise awareness that we have other bloggers on board and that much of the content is of the 'essential' variety. Currently blogging are Tim, Martyn and Jeremy, with our aggregated feed that includes the lot.

Share Calculations and Effective Ownership

In PerformancePoint Planning Server there are several pre-defined model types that satisfy different requirements. One of these model types is the Financial Model with Shares Calculations, and its primary use is the financial consolidation of a group of companies where the ownership of the companies within the group is not 100%.

Take the following international group of companies: Adatis (UK) is the ultimate parent (and in this case the holding company too) and owns a 90% shareholding in Adatis (Italy) and Adatis (US). Adatis (Canada) is only 60% owned, although notice that Adatis (Italy) owns the remaining 40% of Adatis (Canada). Adatis (France) and Adatis (Switzerland) are both 50% owned by Adatis (Italy), which also owns 45% of Adatis (Germany).

In order to work out the consolidated accounts of the Adatis (UK) group, one of the first steps is to determine the effective ownership of each of the companies within the group. This needs to be done from the perspective of the holding company. What do I mean by effective ownership? I mentioned earlier that Adatis (Canada) was 60% owned directly by Adatis (UK) and 40% owned by Adatis (Italy). So you would be forgiven for thinking that Adatis (Canada) is 100% owned within the group, albeit split between two of the companies. However, notice that Adatis (Italy) is only 90% owned. This means that Adatis (UK) only effectively owns 90% of the 40% of Adatis (Canada), plus the 60% it owns directly. Therefore, the effective ownership is 90% of 40% = 36%, plus 60% = 96%. (A sketch of this calculation is included at the end of this post.) The same logic gives the effective ownership from the perspective of Adatis (UK) for the entire group.

PerformancePoint Server's 'With Shares' Financial Model contains a built-in Shares Calculation job that should, in theory, determine the effective ownership for a group of companies. In order to test the PerformancePoint shares calculations you need to load data into two of the three additional measure groups that are automatically created in a 'With Shares' Financial Model.

Shares Outstanding
Each company needs a set of shares issued and the number of those shares that have voting rights. To keep everything simple I've issued all companies with 10000 shares, with all shares having voting rights.

Shares Owned
For each of the parent companies the share ownership needs specifying. We have two parent companies, Adatis (UK) and Adatis (Italy). The shares owned have been set according to our company breakdown illustrated above (one set for the Adatis (UK) owned companies and one for the Adatis (Italy) owned companies).

Control and Ownership
With the data loaded, the shares calculation job can be executed. The results are posted to the third of the additional measure groups, Control and Ownership. It's the POWN columns that contain the effective ownership, and encouragingly this matches our expectation. So it looks like, under these simple conditions, the Shares Calculation job does what it should. Incidentally, there are a few additional columns that are of interest:

- The PCON columns are the percentages that will be used in the consolidation.
- The PCTRL (Ultimate Percent Control) columns specify the controlling percentage. This is essentially driven by the number of voting shares compared to shares owned and effective ownership.
- The 'Overridden' equivalents can be used to override the results of the shares calculation job and are the actual values used in the consolidation process. The non-overridden columns exist to show the outcome of the calculation; these can be altered but will have no effect as they are not used in the consolidation process itself.
- The final additional column represents the Consolidation Method. There is a simple formula that PPS uses to determine the consolidation method:
  o If the ultimate percent control (PCTRL) is >= 0.0 and < 0.2, the consolidation method is NONE.
  o If the ultimate percent control (PCTRL) is >= 0.2 and < 0.5, the consolidation method is Equity.
  o If the ultimate percent control (PCTRL) is >= 0.5 and <= 1.0, the consolidation method is Full.
  o If the entity is the holding company for the model, the consolidation method is Holding.

It's important to note that there is no automatic way of setting a proportionally consolidated company. The consolidation method can be changed, and will need to be changed if, for example, you wanted to proportionally consolidate Switzerland and/or France as opposed to the default Full consolidation.
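As a sanity check on the Shares Calculation job, the effective ownership figures can be reproduced outside PPS. The following is only a minimal T-SQL sketch, not anything the product ships: the DirectOwnership table, its columns and the entity labels are assumptions made purely for illustration, and it assumes there are no circular shareholdings.

    -- Hypothetical table of direct shareholdings: parent, child, percentage owned.
    CREATE TABLE dbo.DirectOwnership
    (
        ParentEntity varchar(50)   NOT NULL,
        ChildEntity  varchar(50)   NOT NULL,
        PctOwned     decimal(9, 4) NOT NULL
    );

    INSERT dbo.DirectOwnership VALUES
        ('Adatis (UK)',    'Adatis (Italy)',       0.90),
        ('Adatis (UK)',    'Adatis (US)',          0.90),
        ('Adatis (UK)',    'Adatis (Canada)',      0.60),
        ('Adatis (Italy)', 'Adatis (Canada)',      0.40),
        ('Adatis (Italy)', 'Adatis (France)',      0.50),
        ('Adatis (Italy)', 'Adatis (Switzerland)', 0.50),
        ('Adatis (Italy)', 'Adatis (Germany)',     0.45);

    -- Walk every ownership path down from the holding company, multiplying the
    -- percentages along each path, then sum the paths per entity.
    WITH OwnershipPath AS
    (
        SELECT ChildEntity, PctOwned AS PathPct
        FROM   dbo.DirectOwnership
        WHERE  ParentEntity = 'Adatis (UK)'

        UNION ALL

        SELECT d.ChildEntity, CAST(p.PathPct * d.PctOwned AS decimal(9, 4))
        FROM   OwnershipPath p
        JOIN   dbo.DirectOwnership d ON d.ParentEntity = p.ChildEntity
    )
    SELECT ChildEntity, SUM(PathPct) AS EffectiveOwnership
    FROM   OwnershipPath
    GROUP  BY ChildEntity;
    -- Adatis (Canada) returns 0.60 + (0.90 * 0.40) = 0.96, matching the POWN result above.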
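The consolidation method thresholds listed above are also easy to express as a simple lookup. Again, purely a sketch with assumed table and column names, not the actual PPS implementation:

    -- Derive the consolidation method from ultimate percent control (PCTRL),
    -- using the thresholds described above. Table/column names are assumptions.
    SELECT EntityLabel,
           PCTRL,
           CASE
               WHEN IsHoldingCompany = 1 THEN 'Holding'
               WHEN PCTRL >= 0.5         THEN 'Full'
               WHEN PCTRL >= 0.2         THEN 'Equity'
               ELSE                           'NONE'
           END AS ConsolidationMethod
    FROM dbo.ControlAndOwnership;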

PerformancePoint Server Job Parameters

Just a quick tip to save some mouse clicks. When you are setting parameters for period start/end type variables, you only have to set the end period if it differs from the start period. I used to think all parameters were mandatory. It's particularly handy for a Consolidation Job as there are two sets of periods to set. It annoys me slightly that there is no dynamic time selection available for job parameters, so you have to scroll through the time dimension to find a specific member, but at least now I only have to do this twice, not four times!

Oooh - See the pretty report ;)

Users love pretty things! Especially upper management, who have minions to actually make sense of the numbers behind the reports for them ;) Out of the box, Reporting Services reports look anything but pretty! Here are a few tips to make your bosses happy :)

Use a background image as a border for charts. The image behind the chart below is a simple PNG file of only 8kb in size. To make this work you need to use a rectangle object with its background image set to your image, and the chart dropped inside the rectangle.

Use custom palettes for your chart series (see the sketch at the end of this post). To do this, set up a new table with two columns: one that relates to the keys in your data source and one with the hex codes of the colours you want to use. Then in your recordset join the id to your data. Once you are returning the hex value for each row to your RS dataset you can use this to define the series colour:

1) Right click on the chart -> Properties
2) Select the Data tab then edit the value
3) Select the Appearance tab then the Series Style button
4) Select the Fill tab then use an expression to
5) Set the Color property to your dataset field
6) Click OK, OK, OK, OK (phew!)

I've uploaded a sample project to show how this is done.

Use black borders and lines sparingly (silver is always a good one).

Put RS in SharePoint Integration mode - the toolbars and filters are much nicer and auto-hide (sort of!). Make sure you consider the limitations of this mode first though.

Avoid using 3D perspectives in your charts - this is the easiest way to make your charts fugly (IMO)!
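For the custom palette tip, the lookup table and join might look something like the following. This is only a sketch - the table and column names (ChartPalette, FactSales, SeriesKey, ColourHex) are made up for illustration; the point is simply that each dataset row comes back with its own hex colour, which the series fill colour expression (e.g. =Fields!ColourHex.Value) can then pick up.

    -- Hypothetical palette lookup table: one row per series key, one hex colour each.
    CREATE TABLE dbo.ChartPalette
    (
        SeriesKey int     NOT NULL PRIMARY KEY,
        ColourHex char(7) NOT NULL          -- e.g. '#4F81BD'
    );

    -- Join the palette to the report query so every row returned to the RS dataset
    -- carries the colour to use for its series.
    SELECT f.SeriesKey,
           f.CategoryName,
           f.SalesAmount,
           p.ColourHex
    FROM   dbo.FactSales AS f               -- hypothetical report dataset source
    JOIN   dbo.ChartPalette AS p
           ON p.SeriesKey = f.SeriesKey;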

PerformancePoint Planning Staging Fact Load Validation

PerformancePoint provides a staging database to load data into. Data loaded can be validated within the staging database before synchronising to the main application. Below is a quick description of the validation steps performed - it could help feed into an ETL specification. If any of the 'rules' are broken, validation will fail and the main application model cannot be synchronised.

Foreign Key Checks
All foreign keys must reference dimension members that exist (a sketch of this kind of check is shown after the list).

Leaf Level Checks
You can only load leaf level facts unless the Business Process member is set to MANADJ or FXADJ.

Header Accounts
Facts cannot be loaded against accounts of type 'Header'.

Business Process Members
The following Business Process members are calculated and cannot be loaded:
- CONSOLIDATED
- PREALLOC
- POSTALLOC
- PREELIM

Time Data View
Facts can only be loaded against the 'Periodic' dimension member.

Currency Checks
The input currency must match the Entity currency unless you are loading against the FXADJ Business Process, in which case you can specify a different currency.

Flow Checks
The following Flow members are calculated and cannot be loaded:
- FX
- FXO
- FXF
- OPE
- MVT
- INT

Account Type/Flow Checks
No Flow is allowed on Income Statement/Profit and Loss and Non Financial accounts. Flow is required for Balance Sheet accounts.

Intercompany Checks
Entity, Intercompany and Account dimension members must all have appropriate Entity and Account types, i.e. types that set the 'Intercompany' member property to true, for all intercompany facts.
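As an illustration of the foreign key checks, an ETL process can look for orphaned rows before attempting the synchronisation. This is only a sketch - the staging table and column names below are invented for the example and will differ in a real PPS staging database:

    -- Minimal sketch: find staged fact rows whose Entity key has no matching
    -- dimension member. Table/column names are assumptions, not the real PPS schema.
    SELECT f.*
    FROM   dbo.Staging_FactLocationPlan AS f      -- hypothetical staging fact table
    LEFT   JOIN dbo.Staging_D_Entity AS e         -- hypothetical staging dimension table
           ON e.MemberId = f.Entity_MemberId
    WHERE  e.MemberId IS NULL;                    -- any rows returned will fail validation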

Book Review: Monitoring and Analyzing with Microsoft PerformancePoint Server 2007

Tim Kent already reviewed this book last month after polishing it off over a weekend. Tim liked the book and gave a positive review. However, I felt it was difficult to gauge how good the book is because, to be fair, Tim is hardly a newbie when it comes to PPS monitoring and analytics. I've focused my PPS time on the planning element and normally throw the M&A elements over Tim's fence, so I've not spent as much time with Monitoring and Analytics as I should have. Tim has got wise and has installed a new, higher fence, so I thought it was about time I put the book to the 'newbie' test.

It's my first Rational Guide and I really like the concept. The guides are limited to 250 pages so they have to be concise and to the point. I like the 'pocket' size and the formatting and layout, which I presume are consistent across all Rational Guides. Considering the book is penned by two authors, I couldn't tell. The writing style is clear and consistent and pitched at the same level throughout the book. The detail is very well written, with forward and backward references to other parts of the book that seamlessly link everything together.

The book is split into 5 parts: Introduction, Installation and Config, The Elements, Implementation and Management, and finally Extras, which at first I thought was just an index and an advert for the other related Rational Guides - wow, thanks, you shouldn't have - but as it happens, on the very last page is information on how to register your book to obtain extra material, such as additional chapters and samples that the authors, Nick Barclay and Adrian Downes, could not squeeze into the already packed 250 pages. One perhaps obvious casualty of the page limit is ProClarity; in the words of the authors, "It is beyond the scope of this book to detail the functionality available in the ProClarity product suite..".

The introduction covers a lot of ground: Performance Management, BI and the Microsoft BI stack, and PerformancePoint Server 2007 itself. A whopping 20 pages of intro, which does eat into the 250 page limit - I'm not sure whether the scope of the intro was specified by the Rational Guide editors or the authors wanted to widen the audience, but I'm not convinced it needed all of it at the expense of product related material. It will be interesting to see how much of it is re-used in the PerformancePoint Planning companion book.

The main body of the book follows a 'theory and practice' type approach: first explaining, in detail, each subject or feature, then following up with a step-by-step tutorial to implement what you've just learnt. Each chapter and section tends to build on the previous one, so it's worth reading from front to back rather than darting around chapters of interest - and this is largely down to the nature of the product rather than any conscious decision, I'm sure. All in all, for me, this approach works; not only do I gain some background insight and theory behind the product, I also gain some touchy-feely experience that you can't glean from reading alone.

If you are a newbie wanting a book to smooth out the learning curve and get you up to speed quickly and easily, with not only the product but the theory behind the product too, this book, in my opinion, is it. It can't take you to guru status but it provides a solid framework and the initial stepping stone.
To sum up, rather than spend half of the 4-day, ~£1000, Microsoft Partner PPS course learning about Monitoring and Analytics, take a couple of days off, buy this book and use it on a VPC image of the MS BI Stack - I bet you'll learn more!

PerformancePoint Server and SQL 2008

So after many hours looking into this issue, it seems that the current release of PPS does not work on SQL Server 2008, and there seems to be no tricking it. Should you manage to get round the install checks, the versions are checked once again while setting up the database connections to the SQL and Analysis servers, and the door is currently closed. In the meantime, SQL 2005 will still work! SQL Server 2008 is still due to launch in February; however, it now looks as if the RTM (release to manufacturing) will not be until Q3 of 2008, so we are unlikely to see even early adoption until around July. Microsoft see SQL Server 2008 as a critical step forward towards their "broader vision for business intelligence", so I think we can rest assured that PerformancePoint will be integrated before the release date. In the meantime I will update you with any further developments on this, and will continue testing with further CTPs and patches.

71-556 PerformancePoint 2007 Beta Exam - We only bl**dy did it !

Back on November 9th, the (I think) last available day it could be taken, Tim Kent and I took the Microsoft Office PerformancePoint Server 2007, Application Development beta exam (71-556) at the Prometric testing facility hosted by Global Knowledge Training in Wokingham. Soon after, we were both invited to Seattle to attend the PPS exam preparation workshop to review the exam in readiness for the official release. Disappointingly I couldn't make it due to other PPS related commitments; however, much to my disgust ☺ Tim went without me and by all accounts had a blast on the slopes of Whistler meeting a like-minded bunch of people.

Today we both found out we actually passed! This means we don't have to re-take the officially released exam as we get an automatic credit for passing the beta version - nice! I have not seen any official release information for the exam proper but it must be imminent now that we've received our results. I'm not sure if this is fact, but I have been told that Microsoft use the '71' prefix for beta exams, so we should expect the official exam to be numbered 70-556. Anyone know if this numbering convention is true?

Obviously my hands are tied by the exam NDA so I can't reveal any details about the content of the exam. Already publicly available is the list of topics that the exam covers; this list, repeated below, does match my experience of the beta exam. I'll stop there as I've probably already said too much! Good luck if you go for it. Oh yeah, happy new year!

Microsoft Office PerformancePoint Server 2007, Application Development Exam Topics

Configuring Dimensions
- Configure properties of predefined dimensions
- Configure the time dimension
- Configure a member property
- Populate dimensions and member sets
- Update dimension members, member sets, and member properties

Configuring Models
- Create a PPS model
- Configure the current period
- Configure model and dimension usage within sites
- Transfer data between models

Configuring Business Roles and Process Management
- Configure business roles
- Configure security for users
- Configure assignments
- Configure an approval structure

Developing Business Rules
- Develop an assignment rule
- Develop a definition rule
- Develop an allocation rule
- Configure template rules
- Configure rule execution

Building Excel Add-In Forms and Reports
- Build an Excel Add-In report
- Configure an Excel Add-In report as a form
- Configure report filters
- Format an Excel Add-In report

Developing Dashboards
- Create a report by using the Dashboard Designer
- Add a data source connection
- Add a new Web page report
- Configure a KPI by using Dashboard Designer
- Create a scorecard
- Build a dashboard

Installing PerformancePoint Server
- Install the monitoring and planning services
- Configure a PerformancePoint application
- Configure the PerformancePoint server
- Assign users to administrator roles