Sacha Tomey's Blog

PerformancePoint Planning - Currency Conversion (Triangulation)

I've blogged about some of the financial consolidation elements of PerformancePoint Planning before.  Most of the elements related to financial consolidation, e.g. share calculations, intercompany transactions etc., are specific to consolidation and only implemented when performing some form of statutory consolidation.  Currency conversion, on the other hand, can be common to any element of PerformancePoint Planning, whether it's a budgeting, forecasting, planning or, in fact, a statutory consolidation solution.

In its simplest, out-of-the-box form, currency conversion is pretty straightforward.  You need an exchange rate assumption model to hold the exchange rates, and you need a financial model without shares that contains a pre-defined currency conversion rule that utilises the exchange rate assumption model.  You run the pre-defined rule on a set of financial data and the currency conversion is calculated for you.  There's little more to it than that.

This post looks a little (and I mean a little) deeper into currency conversion and demonstrates the triangulation features contained within PPS-P.

Consider this set of exchange rates, taken from www.xe.com today:

image

On the face of it, we have a few rates that allow us to convert from British Pound to Japanese Yen, US Dollar to both British Pound and Euro, and finally a rate for converting Euro to Japanese Yen.

When the in-built currency conversion rule is run against, for example, a set of US financial data, we can determine the British Pound equivalent.  The Net Profit/Loss of 110,500 USD, converted at a rate of 53.5 British pence per dollar, comes to 59,117.50 GBP - as you would expect:

image


What would happen if we were to run the currency conversion rule again but this time try and convert the US Dollar figures to Japanese Yen? 

The astute among you would point out that we don't have an exchange rate for converting US Dollar to Japanese Yen.  The more astute among you might say that we could determine the JPY figures by first converting to GBP and then converting the GBP figures to JPY.  The even more astute among you might suggest that the Japanese Yen could be calculated in two ways: either by first converting to GBP and then to JPY, or by first converting to EUR before converting to our desired JPY.

Who's right?  Well, all of you, kind of!

If I run the currency conversion job again but instead of selecting GBP as the reporting currency, I select JPY, I get the following result:

image

No conversion!  Why?  Well, we don't have an exchange rate set for converting USD to JPY!

This is where triangulation comes in.  With our current set of exchange rates there are two methods for converting USD to JPY: either through GBP or through EUR.  You set which 'route' the triangulation takes by setting the model property 'Default Currency'.  By default this is not set, hence our failed conversion above.

image

However, if you set it to either GBP or EUR you will get some results, but there is a catch - the 'route' you take returns different results.  Incidentally, you don't have to redeploy the model when you change this property; you just need to save the model.

Triangulate through GBP

110,500 USD at a rate of 0.535 equals 59,117.50 GBP.  Convert the GBP to JPY at a rate of 205.524 and that equates to 12,150,065.07 JPY.

image

Triangulate through EUR

110,500 USD at a rate of 0.679 equals 75,029.50 EUR.  Convert the EUR to JPY at a rate of 162.068 and that equates to 12,159,881.006 JPY.

image

The two 'routes' produce a variance of 9,815.936 JPY (about 90 USD) - a triangular arbitrage!
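
For anyone who wants to check the numbers, here's the arithmetic for both routes as a quick T-SQL sketch.  It's purely a back-of-the-envelope check using the rates quoted above, not how the built-in currency conversion rule actually performs the calculation:

-- Quick check of the two triangulation routes using the rates quoted above
DECLARE @USD decimal(18, 6)
SET @USD = 110500

SELECT
    @USD * 0.535 * 205.524 AS JPY_via_GBP,                           -- USD -> GBP -> JPY = 12,150,065.07
    @USD * 0.679 * 162.068 AS JPY_via_EUR,                           -- USD -> EUR -> JPY = 12,159,881.006
    (@USD * 0.679 * 162.068) - (@USD * 0.535 * 205.524) AS Variance  -- 9,815.936 JPY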

So, two questions:

1) Which 'route' should you use?
That's a business decision, and if it's going to be a common conversion then a direct rate ought to be favoured over triangulation.

2) How do you exploit this model to work out guaranteed triangular arbitrage to make millions through currency trading?
I'm working on it!

The Second Annual Microsoft BI Conference, 6th-8th October 2008 - Seattle

I was unable to make the first Microsoft BI Conference in Seattle last year, but Tim made it and reported that it was a great excuse for an extended weekend of partying and snowboarding in Whistler - and, more importantly, a great way to meet like-minded people, learn the latest buzz and feel part of the BI community.

Well, this year I can make it and I will be joining Tim and Jeremy for the duration.  Bad luck for me though, it's too early for snow! ;o)

I've checked out the tracks and sessions and there are quite a few I'm up for.  They haven't published the schedules yet, so whether I can jump tracks and still fit everything in remains to be seen, but here are my picks:

Business Value
- Emerging Trends and Technologies in the Business Intelligence Market
- The Path to Pervasive Performance Management
- The ROI of Performance Management

Clients and Applications Track
- Advanced Office PerformancePoint Server 2007 Planning Business Rules and Calculations
- Integrating Microsoft Dynamics Data for Planning, Analysis and Reporting in Office PerformancePoint Server 2007 
- The 4th Annual BI Power Hour

Customer Track
- Master Data Management (MDM) Customer Roundtable
- Successfully Implementing the Complete Microsoft BI Stack
- The Evolution of “Difference Makers” at Energizer Holdings, Inc. Using Microsoft Business Intelligence

Deployment & Best Practices Track
- Designing High Performance Cubes in SQL Server 2008 Analysis Services
- Mastering Master Data Management
- Office PerformancePoint Server 2007 Planning - Implementation Tips And Tricks

Partner Track
- BI Unplugged - a Panel Discussion

Platforms and Infrastructure Track
- Optimizing Query Performance in Microsoft SQL Server 2008 Analysis Services
- SQL Server 2008 & Beyond
- SQL Server 2008 Integration Services New Features and Enhancements

Phew!  Well, after that lot I'm definitely gonna need a drink, so if you fancy a bevvy, hunt me down - it'd be great to hook up!

PerformancePoint Server Planning SP1 - Clear Changes After Workflow Action

There's a new workbook property that was introduced in PPS Service Pack 1.  The 'Clear Changes After Workflow Action' property effectively splats (clears) the change list for the workbook once the assignment has been submitted (either draft or final).

The property can only be reached through the Report Properties dialog, and is at the workbook level:

image

This property defaults to false which, under certain circumstances, can hinder performance.

Whenever you change data on a matrix, the slice that you affected is saved to a change list.  You can view what's on the change list by choosing 'View -> Show Current Changes' from the PPS Add-In for Excel.

image

Here's an example change list; two budget accounts for the same time period and department have been updated with new values.

image

The default behaviour (and the behaviour prior to SP1) is that, for the life of the assignment, the change list is maintained for every cell that is updated.  The change list is simply appended to, so you can imagine that, on a large workbook with several matrices spanning several filter slices, the change list can become quite large.

Submitting the assignment effectively submits the change list for processing by the server, first updating/inserting the appropriate records into the fact table and subsequently re-processing the Analysis Services partition.  It follows then, that the larger the change list, the slower the submission process.

Before SP1, this forever-growing change list issue was resolved with a little user training.  As part of the submission process you would invite your users to manually clear the change list:

image

By 'Clearing Current Changes' you throw away the changes to the cells and have to rely on the data being safe and sound on the server.  This process helped keep the change list to a more manageable size, thus improving submission performance.

The new 'Clear Changes After Workflow Action' property in SP1, if set to true, will perform the 'Clear Current Changes' step for you automatically.  This helps keep the change list lightweight (providing, of course, the user submits regularly).  However, as I have already implied, there is one issue to be wary of: with the property set to clear changes, if your submission fails, the change list is lost and there is a real danger of losing data.

Adatis Goes Gold!

We are proud to announce that Adatis Consulting has been awarded Microsoft Gold Certified Partner Status.  This achievement reflects our continued investment in the latest Business Intelligence and Performance Management technologies from Microsoft and is, in no small part, thanks to our customers, partners, and the outstanding team at Adatis.

PerformancePoint Planning - Loading the account dimension members from CSV

This post is more for my own benefit; I can never remember the format of the CSV file used to load the Account dimension.

Below is an example CSV file, in its simplest form, that should help get me started faster the next time I need it!

Label,Name,Description,AccountTypeMemberId
1000,Salaries Permanent Staff,,Expense
2000,Temporary Staff Workload,,Expense
4000,National Insurance Costs,,Expense
5000,Pension Costs,,Expense

PerformancePoint Planning - Data change tracking / audit trail

PerformancePoint Planning does contain auditing out-of-the-box.  However, it's not data-level auditing but metadata-level auditing.  If you want to track when a user has been added, or a workflow cycle started, then PPS-P keeps track of these changes, plus a whole host of other system changes, in an XML file located in the directory specified in the 'Audit file name' property of the 'Auditing' tab in PAC.

The auditing will answer questions such as:

"When was the staging database last synchronised?"
"Who re-started my assignment?"
"When and who changed these business rules?"

This audit file does capture a lot and, one day, unless I'm beaten to it by Microsoft (hint hint!), I'll either write an XSLT file to transform the XML into a 'nice looking' HTML page, or build some routines to suck the audit files into a usable database where I can stick some SSRS reports over the top.  Until then, Notepad will have to do.

What's missing out-of-the-box is the ability to audit data-level changes - the questions that are actually more pertinent to Business Performance Management and to requirements under Sarbanes-Oxley (SOX), the Basel Accords and other related regulations:

"Who increased the salary assumption for the North West sales force?"
"When were the revenue targets last increased?"
"How many times do the sales forecasts get amended before they are approved?"

This post attempts to detail a simple mechanism that could be used to implement a basic data change audit trail capable of answering questions like those above.  At this juncture I must point out that the approach described below entails making database-level changes to the Planning Application database and would therefore result in an unsupported environment.  In reality you would leave the planning database well alone and perform these changes on a log-shipped or equivalent synchronised copy of the database, well away from the main production planning application database.  There, you have been warned!

In this scenario I want to place an audit trail on a simple assumption model.  The assumption model contains five dimensions: Time, Account, Region (Entity), Role (a custom dimension) and Scenario.  I want to track all changes to data within this model.

image

To set up the auditing the following is required:

Audit Tables

Three new tables are required; these are essentially copies of existing Planning Application tables and are used to keep a history of inserts and updates.  I create these tables in an 'Audit' schema; this keeps them grouped together and allows me to maintain the same name as the original.  The only difference between each audit table and its original is the addition of a new primary key, 'AuditID'.

  • MG_Drivers_MeasureGroup_default_partition
    This is the main fact table of the assumption model.  A complete history of changes will be maintained in its audit copy.
  • BizUsers
    BizUsers contains all current users of the planning application.  As users can be removed from the application, it is necessary to audit this table to ensure changes made by non-current users can always be traced.
  • AssignmentsHistory
    Stores history information relating to an assignment, such as when it was submitted (either draft or final) and the comments that were added by the submitter.  The key field in this table is the UserID, which lets us know where to point the finger of blame (figuratively speaking, of course!).  Like the BizUsers table, when assignments are removed the associated history records are purged too; auditing this table preserves that history.

You could argue that each of the dimension tables also need auditing for cases when members are removed from the dimension in the future.  I can't argue against this and depending on your business process and application you might be quite right !
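
One prerequisite worth calling out: the 'Audit' schema itself needs to exist before any of the scripts below will run.  Assuming dbo ownership is acceptable, a one-off statement along these lines creates it:

-- One-off: create the schema that will hold the audit copies (dbo ownership is an assumption)
CREATE SCHEMA [Audit] AUTHORIZATION [dbo]
GO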

Example Audit Table Create Statement:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [Audit].[AssignmentsHistory](
    [AuditID] [int] IDENTITY(1,1) NOT NULL,
    [AssignmentID] [bigint] NOT NULL,
    [ChangeDatetime] [datetime] NOT NULL,
    [CreateDatetime] [datetime] NOT NULL,
    [UserID] [bigint] NOT NULL,
    [Action] [nvarchar](100) NOT NULL,
    [Comments] [nvarchar](2000) NULL,
 CONSTRAINT [PK_AuditAssignmentsHistory] PRIMARY KEY CLUSTERED 
(
    [AuditID] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

Triggers

Now, one of my colleagues in particular will shoot me down for this because, for years, I've been strongly and completely against triggers.  They cause no end of pain when performance-tuning and debugging line-of-business application databases.  I always argue that all trigger logic can, and should, reside in (transaction-based) stored procedures that actually perform the data updates.  I do still stand by this but, in this situation, we don't really have general access to the data update stored procedures, so, although I'm breaking one of my own ten commandments, my hand is forced.

For each of the original tables a trigger is required to update the appropriate audit table.  To make the reporting easier I create a trigger for both Update and Insert; this means that the most recent version of the record is in the audit table too.  (C'mon, storage is cheap!)

Example Original Table Trigger Create Statement:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

-- Copies every inserted or updated row into the corresponding audit table
CREATE TRIGGER [dbo].[trig_AssignmentsHistory_Audit] ON [dbo].[AssignmentsHistory]
FOR UPDATE, INSERT
NOT FOR REPLICATION
AS

SET NOCOUNT ON

-- 'Inserted' holds the new version of the row(s) for both inserts and updates
INSERT INTO Audit.AssignmentsHistory(
    AssignmentID, 
    ChangeDatetime, 
    CreateDatetime, 
    UserID, 
    Action, 
    Comments)
SELECT
    AssignmentID, 
    ChangeDatetime, 
    CreateDatetime, 
    UserID, 
    Action, 
    Comments
FROM Inserted

View

In order to report the audit trail, a simple view is required to show the changes over time of the values held in the fact table.  This view is primarily based on the audit fact table and resolves all the dimension member IDs to give you the dimension member names.

Example Audit Trail View: (This view is specific to my fact table.  It will need modification for each fact table).

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [Audit].[AuditReport]
AS
SELECT
    AuditFact.ChangeDatetime,
    dbo.D_Scenario.Name AS Scenario, 
    dbo.D_Account.Name AS Account, 
    dbo.D_Entity.Name AS Region, 
    dbo.D_Role.Name AS Role,
    -- Resolve who made the change via the audited assignment history and user tables
    (SELECT TOP 1 UserName 
     FROM Audit.AssignmentsHistory ah 
        INNER JOIN Audit.BizUsers bu ON ah.UserID = bu.UserId 
     WHERE ah.AssignmentID = AuditFact.AssignmentId) AS ChangedBy,
    AuditFact.Value
FROM
    Audit.MG_Drivers_MeasureGroup_default_partition AS AuditFact 
        INNER JOIN dbo.D_Role ON dbo.D_Role.MemberId = AuditFact.Role_MemberId 
        INNER JOIN dbo.D_Account ON dbo.D_Account.MemberId = AuditFact.Account_MemberId 
        INNER JOIN dbo.D_Scenario ON AuditFact.Scenario_MemberId = dbo.D_Scenario.MemberId 
        INNER JOIN dbo.D_Entity ON AuditFact.Region_MemberId = dbo.D_Entity.MemberId

The output of the view above will detail each of the changes made to each of the facts over time when queried like this:

SELECT
    Scenario, 
    Account, 
    Region, 
    Role, 
    ChangedBy,
    ChangeDateTime,
    Value 
FROM 
    [Audit].[AuditReport] 
WHERE 
    Scenario = 'Budget' AND
    Account = 'Annual Salary' AND
    Region = 'South West' AND
    Role = 'Assistant Manager'
ORDER BY 
    ChangeDateTime DESC

Resulting in the following output (most recent first):

image 

Notice the HR Director set the initial salary value for the South West Assistant Manager; the rather indecisive Finance Director then made a few amendments, and the value was later adjusted again by the HR Director.  PerformancePoint Planning data change tracking!

Possible Enhancements

The above method describes a very simple way of auditing data level changes in PPS-P.  This can be taken much further and I've included some possibilities below:

  • Develop a parameter-driven SSRS report to present the audit trail for a given dimension member combination.
  • Create some generic scripts that accept the fact table as a parameter and create the appropriate audit tables, triggers and views (a rough starting point for the audit table part is sketched after this list).
  • Link up with the fact table annotations table and/or the submission comments (held in the AssignmentsHistory table) to enrich the reporting.
  • Purge/archive audit records intelligently - the data volume will obviously grow over time.
  • Update the view to handle certain conditions where adjacent audit trail records contain no apparent change in value (only the change date will be different - this can occur when a contributor submits, makes subsequent changes and submits again without first clearing current changes).
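
To give a flavour of the second bullet, below is a rough, hypothetical sketch that clones the structure of a named fact table into the Audit schema and adds the AuditID key.  The table name is the one used in this post; the matching trigger and view would still need to be generated separately, and the script hasn't been hardened beyond this example:

-- Hypothetical starting point for a 'generic' audit table script.
-- Assumes the [Audit] schema already exists and the fact table lives in [dbo].
DECLARE @FactTable sysname
SET @FactTable = N'MG_Drivers_MeasureGroup_default_partition'

DECLARE @sql nvarchar(max)

-- Clone the column structure only (WHERE 1 = 0 copies no rows and no keys)
SET @sql = N'SELECT src.* INTO [Audit].' + QUOTENAME(@FactTable)
         + N' FROM [dbo].' + QUOTENAME(@FactTable) + N' AS src WHERE 1 = 0;'
EXEC sys.sp_executesql @sql

-- Add the surrogate key that distinguishes successive versions of each record
SET @sql = N'ALTER TABLE [Audit].' + QUOTENAME(@FactTable)
         + N' ADD AuditID int IDENTITY(1,1) NOT NULL CONSTRAINT '
         + QUOTENAME('PK_Audit_' + @FactTable) + N' PRIMARY KEY CLUSTERED;'
EXEC sys.sp_executesql @sql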

An unknown server error occurred when submitting User

We recently had to turn a couple of demo boxes into a two-client development environment for an overly aggressive PPS-P deadline.  Both demo boxes were connected to the network and both were PPS-P servers.  We wanted to nominate a specific box as the PPS-P server for the build.

Typically, the one we nominated, for some reason, would not allow us to add domain users to the Users tab of PAC, whereas the other one would.

We received the following (not particularly helpful) error message when attempting to add a domain level user through PAC.

image

The following items cannot be saved

The total number of errors during submission is 1.
An unknown server error occurred when submitting User - <domain>/<username>.  The server log may contain more information about this error.

In our environment both boxes were using LOCAL (but different) accounts as the Application Pool identity and the Planning Service Account identity.  When we changed the local identity accounts over to DOMAIN-level accounts we could add users on both boxes.  It appears that the PPS-P web service needs access to the domain controller to check whether the user entered actually exists.  In fact it does a little more than that, as it also retrieves the user's SID to store in the Planning database.

Note: in addition to changing the identity to a domain account, we had to ensure that the account existed in the local IIS_WPG (IIS Worker Process Group) too.

But why, when using local accounts, would one server allow the addition of domain users and the other not?  The answer was pass-through authentication!  The server that worked under the local account was using an account and password combination that matched an account and password combination on the domain!

Nick Barclay posted about connection problems with PPS M&A a good while back and gave a great write-up of how to overcome the issues encountered.  That's worth a read if you are not familiar with IIS application pools and identity accounts, as the issues we encountered were related.