Tim Kent's Blog

BI Conference - Day 3

So after a slight delay, here's a quick note on Day 3 and the conference in general.

We only caught the second half of the keynote from Kurt DelBene of the Office group - investing in Excel, making it richer - and what a great reporting tool Access is? Errr, OK.  You can imagine the fun and games that go on between the Office and SQL teams, with the PPS team stuck somewhere in the middle :)  More ridiculous staged customer Q&A-style discussion followed about how great Office is - which saw a good chunk of the audience leaving again - seriously guys!

The first session of the day - Avoiding Common Mistakes with Analysis Services - nothing ground-breaking here, but the session was really well presented by Craig Utley, who in particular explained the concept of attribute relationships really well.  Plus a reminder of the extremely useful IgnoreUnrelatedDimensions property, which stops users from seeing data where they've selected a dimension that doesn't link to the measure group (think AdventureWorks: Reseller Sales with the Customer dimension).

I ducked out of the last session to take a look round the shop - I nearly got myself a Microsoft embroidered denim shirt but I knew how envious the guys back at the Adatis office would be and they didn't have enough for the whole company.  I settled for two Microsoft golf shoe bags - Martyn and Sacha are gonna be so pleased on their birthdays!

So highlights for me:

  • Getting to meet some very cool people, particularly some of the PPS product team who took time out of their schedule to chat to us and were genuinely interested in what we had to say.
  • The BI power hour was great - some genius use of PPS as I reported in Day 2 - I hope we can get our hands on the "Perfoply" and "Girlfriend Management" Planning models.  They could be deal clinchers with certain potential customers just to show how varied the possible uses of PPS Planning are.

Lowlights:

  • Some very average sessions (mixed in with some very good ones, mind).  The chalk talks were particularly bad logistically: over-subscribed, overcrowded and with poor AV;
  • As reported by Dan English and others, there was a serious lack of any PPS v2 (or even SP2) information - you'd think with Gemini being announced they'd want to at least hint at what was coming (particularly from an M&A perspective).  However, the "Upgrade to PerformancePoint" button in Gemini might suggest it's going to play a big part.  I have a feeling M&A v2 will be a huge step up.

Generally pretty good - I'll be back next year for sure.  But for now time to get back to the important business of day 5...

BI Conference - Day 2

First of all, a quick correction following my post on day one, where I had a bit of code name confusion: Project "Madison" is the evolution of the DatAllegro acquisition, Project "Gemini" is the self-service BI, and "Kilimanjaro" is the code name for the (interim?) release in the first half of 2010 that will include "Madison" and "Gemini".

So what of day two? The Ben Stein keynote was interesting, though not a patch on Michael Treacy's from last year.  This was followed by an extremely cheesy and completely staged "Q&A" session with the platinum sponsors about where BI will be in 2020.  I didn't stay to hear the answers! I know they have to keep these sponsors happy, but do MS really think people take any notice of this stuff?

I went for the MDM session next, which was very heavy going and disappointing in that we still didn't get to see the product - though we may see it at next year's BI conference (same time next year)!

Got to meet Patrick Husting and a number of other PPS experts over lunch, and in the afternoon we got a chance to quiz some of the Gemini team on a few of the points mentioned yesterday.  I raised the topic of "AS hell", where users are creating random cubes all over the place, and they had a good response: people are always going to do self-service reporting in some way whether we like it or not, no matter how good the underlying data is, so why not do it in a controlled manner where everything is audited and logged and the IT team has full visibility of what is going on.  There was a "Gemini" breakout session which showed the operations dashboard behind the scenes - very cool.  "Gemini" really looks impressive and has obviously already had a lot of effort put in.  I have a feeling the usual version 1 worries may not surface too much here - people are going to be desperate to get their hands on it!

Last session of the day for me was the BI power hour, which was very entertaining. A working Monopoly game in PPS with full analytics and Profit and Loss for each player, a girlfriend management Planning model (looking at how the seriousness of your relationship affects your future cashflow - ha ha) and two-player battleships in Reporting Services - I want these guys' jobs! Also a chance to briefly see MDM in action - it looks pretty good! Why they couldn't show it in the MDM session I don't understand...

Finally, the conference party, which was held at the impressive Qwest Field, home of the Seattle Seahawks, with casino games, lots of Xbox, and football and American football out on the pitch.

Here's Jereminho scoring an absolute peach - back of the net!

Day 1 at the BI conference

So here we are in Seattle at the BI conference.  Day 1 and it's been great to catch up with some old faces and meet some new ones.  We were promised some big news today and as is being reported by Chris, Mosha, Marco and all, we weren't disappointed.  Both "Kilimanjaro" and "Gemini" look super-exciting for all of us in the MS BI world.  The former is the evolution of the DatAllegro acquisition  - we saw an SSRS report that ran in 10-15 seconds against (hold little finger to corner of mouth) one trillion relational records in a 150 terabyte database - impressive. 

Equally impressive is 20 million rows sorting and filtering in the blink of an eye!  The Gemini project (self-service BI) had two key features for me: firstly the in-memory storage (think TM1) that allows that sort of performance, and secondly - something that hasn't been widely commented on so far - the ability to publish your Excel reports to SharePoint/web (as XLCubed Web does now) at the click of a button.  The interface looked really good already and of course it's generating AS cubes behind the scenes.  It did raise the question of how it's all going to fit in with PPS V2 - hopefully the Office and SQL teams have been talking!

I think a lot of BI professionals were probably initially thinking (like me) that this could be bad news for their careers, but having taken it all in, I don't think that's the case.   Although it's going to have data cleansing abilities, this isn't going to be a replacement for data warehouses/marts.  In fact, for the whole self-service BI thing to really take off, it's going to need really good data underneath it.   Microsoft's intention with the Gemini release is to "democratise" BI; the more people who get to use this type of tool, the better the data quality will have to be - you can't expect every department to have to clean its own data.

Allowing users to create and publish their own cubes and reports has a few warning signs as well - isn't this just Excel hell without Excel? Every department can create its own calculations, reports, cubes, etc. We've been telling our clients for years that they really need one version of the truth - but now we're going to let each user make their own truth? It will certainly need some thought.

As Chris W mentions, it's likely to also still need technical resource to help users create complex calculations so we won't be out of a job just yet ;)

As for the rest of the day, some interesting sessions around PPS and SSRS were the order of the day for me.  There was disappointment in one respect though, as I found out that the PPS Monitoring SDK definitely won't allow you to build a custom cascading filter - something I was going to investigate.

More later this week....

PPS Monitoring - Missing Parameters using a Reporting Services report in SharePoint integrated mode

If you're running Monitoring with SP1 applied and working with a Reporting Services report in your dashboard that comes from a report server in SharePoint integrated mode, you may experience an issue where your parameters don't appear in the Report Parameters section.

There's a hotfix available for this issue upon request from MS support:

http://support.microsoft.com/kb/956553/en-us

PerformancePoint at the MS BI Conference

As Sacha has mentioned, he, Jeremy and I will all be attending the BI conference in Seattle in less than two weeks' time.  Last year was a great chance to meet some of the BI community and attend some really good BI sessions, but it was (unsurprisingly) a little light on PPS content.  This year we're overloaded with sessions and it's going to be a tough choice which to attend, with at least one and often two or three potential sessions in every time slot.  A couple that have caught my eye:

Advanced Dashboard & Scorecard Design Techniques - just one of a number of sessions that Alyson Powell Erwin (Monitoring Product Manager) is presenting - those of you who read the Monitoring forums frequently will know that Alyson has the answer to most of the questions raised.

Office PerformancePoint Server 2007 Planning - Implementation Tips And Tricks - a chance to hear how other PPS teams have gone about projects.

I've no doubt Jeremy and Sacha will be going to see Michael Bower and Scott Sebelsky present Financial Consolidation with Office PerformancePoint Server 2007 as well as numerous other Planning specific sessions.

This is just a taster - take a look at the session list and plan your diary.  On top of this there are some great SQL BI sessions, as well as a chance to get a look at MDM - hopefully they will be filming lots of the sessions as they did last year.

We're also hoping to meet (in person) as many as possible of what's becoming a really strong PPS community - do make sure you come up and say hello if you see us around.

PerformancePoint Monitoring Web Service - Part 3

In Part 1 we looked at the basics of connecting to the web service and retrieving the metadata.  Part 2 looked at how to create a datasource first class object (FCO) and the quirks of the "well known" properties (thanks to Wade Dorrell for the feedback and for raising a usability bug for the next version).  In this post we'll look at how to create one of the other FCOs - in this case a report, though as you can imagine the process for creating a KPI (the other "singular" FCO) is similar.

Once again, it's not too hard to work out that we'll need to use the CreateReportView function, and as before we'll create a ReportView object in our code that will get passed to the function once we've set up all the properties.

    private static PPSM.ReportView CreateReport(PPSM.PmService mon)
    {
        //declare our reportview object
        PPSM.ReportView rep = new PPSM.ReportView();
        //set the guid for our object - this gets used as the primary key in the db table
        rep.Guid = System.Guid.NewGuid();

We know how to set the "well known" properties from Part 2, and it's exactly the same process here (including the identifier), so it makes sense to create a re-usable function to do this for all our FCOs:

    private static PPSM.BpmProperty[] SetWellKnownProperties(string name, string description, string owner)
    {
        //declare the three individual property types for the property array
        PPSM.BpmPropertyText bpmName = new PPSM.BpmPropertyText();
        PPSM.BpmPropertyLongText bpmDesc = new PPSM.BpmPropertyLongText();
        PPSM.BpmPropertyUser bpmOwner = new PPSM.BpmPropertyUser();

        //set some details for the properties
        bpmName.Text = name;
        bpmDesc.Text = description;
        bpmOwner.Login = owner;

        //initialise the properties array
        PPSM.BpmProperty[] wkprop = new PPSM.BpmProperty[3];

        //set the properties array with the three elements
        wkprop[0] = bpmName;
        wkprop[1] = bpmDesc;
        wkprop[2] = bpmOwner;

        //set the unique name of each property using the well known GUID prefix
        wkprop[0].UniqueName = "8dd07d4d87794510afdb1f07664359bc_Element_Name";
        wkprop[1].UniqueName = "8dd07d4d87794510afdb1f07664359bc_Element_Description";
        wkprop[2].UniqueName = "8dd07d4d87794510afdb1f07664359bc_Element_Owner";

        return wkprop;
    }

Let's call the function and also set the type of report (in our case an OLAP grid).  All good so far....

        //set the well known properties
        rep.Properties = SetWellKnownProperties("My Report", "My Rep Desc", "Me");
        //set the type
        rep.TypeName = "OLAPGrid";

There are a couple more obvious properties that can be set (e.g. begin points and end points), but from here things get more complicated.  The majority of the important properties are stored in a single XML property of the ReportView object called CustomData.  This means you'll have to create and manipulate an XML document in your code that matches exactly the format required.

In practice this may preclude using the web service alone to create FCOs completely from scratch.  As an alternative workaround you could start from a template report stored on the server and then use and update the CustomData property from that template.  In reality, of course, you're unlikely to be automating the creation of reports completely from scratch - much more likely just updating the report datasource, for example.  Just for the sake of proving the concept, this is what we'll do in our case.

The other workaround is to code against the object model in Microsoft.PerformancePoint.Scorecards.Client.dll rather than the WSDL of the web service.  This exposes the properties more clearly and is, as I understand it, what happens behind the scenes in Dashboard Designer. IMHO this slightly defeats the object of a web service, but I'm sure it will be rectified going forward.

As we've already gone through the process of creating a new ReportView object, we'll carry on down that path for this example.  Of course, it might be simpler in reality to grab an existing ReportView object using GetReportView, change the required properties and assign a new GUID.  You use the CreateReportView function regardless of whether you are creating or updating - the existence of the object's GUID in the underlying database table defines which happens.
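As a minimal sketch of that clone-and-update approach (the template GUID is just a placeholder - substitute the ID of an existing report from your own Monitoring database):

    //a sketch of the clone-and-update approach
    private static void CloneReport(PPSM.PmService mon, string templateReportID)
    {
        //grab an existing report from the server
        PPSM.ReportView template = mon.GetReportView(new Guid(templateReportID));

        //assigning a new guid means CreateReportView will create rather than update
        template.Guid = System.Guid.NewGuid();

        //re-use our helper from above to rename the copy
        template.Properties = SetWellKnownProperties("My Cloned Report", "Copied from a template", "Me");

        //save the copy back to the server
        mon.CreateReportView(template);
    }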

This function returns the CustomData for an object:

    private static string GetCustomData(PPSM.PmService mon, string ReportID)
    {
        Guid g = new Guid(ReportID);
        PPSM.ReportView rep = mon.GetReportView(g);
        return rep.CustomData;
    }

Next we need to declare an XmlDocument to store the CustomData and call our function, passing in the GUID of a known report.  I'm sure there are much cleaner ways to work with XML objects, but it's Friday night!!

        //declare an xmldoc to hold the customdata
        XmlDocument custData = new XmlDocument();

        //you'll need to find the id of a similar type report from your PPS Monitoring database
        //load the custom data from an existing report into an xmldocument
        custData.LoadXml(GetCustomData(mon, "d8c5ffbd-c01c-409f-ab1b-d0695227049b"));

Now, obviously, using the GUI of Dashboard Designer it's easy to grab the data source by name.  In code you would need to come up with a clever way of doing this, or just work with knowing the GUIDs of the datasources you want to use for the report.  It's possible that you may have just created or updated a datasource as part of your application.
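One hedged sketch of a "clever way": loop through the datasource FCOs and match on the name property.  Note this assumes the proxy exposes a GetDataSources() call and a PPSM.DataSource type - check your generated proxy classes for the exact names:

    //hypothetical helper: look up a datasource guid by its display name
    //assumes GetDataSources() exists on the proxy - verify against your WSDL
    private static Guid? GetDataSourceIdByName(PPSM.PmService mon, string name)
    {
        foreach (PPSM.DataSource ds in mon.GetDataSources())
        {
            //the name lives in the first "well known" property (see Part 2)
            PPSM.BpmPropertyText nameProp = ds.Properties[0] as PPSM.BpmPropertyText;
            if (nameProp != null && nameProp.Text == name)
                return ds.Guid;
        }
        return null;
    }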

        //update the datasource elements of the customdata - quick and dirty hack!
        custData.DocumentElement["QueryData"].ChildNodes[1].InnerText = "3da20613-0a02-4e6f-ad17-6a42ec4d6b62";
        custData.DocumentElement["QueryState"].ChildNodes[2].InnerText = "3da20613-0a02-4e6f-ad17-6a42ec4d6b62";

        //and now lets pass the edited customdata to our new reportview object
        rep.CustomData = custData.InnerXml.ToString();

Finally, pass the modified object to the web service:

        //finally lets pass our modified reportview object to the webservice
        mon.CreateReportView(rep);

Updated project is linked below as usual and includes an example of the customdata xml for a reportview object.  Thanks as usual to Wade, Alyson and Tim at MS for taking the time to answer my DFQs!

PPSMWebService.zip (24.93 kb)

Type 2 SCDs - Calculating End Date when you only have Effective Date

Here's a quick one for creating an end date column when you only have an effective date column in your data source to work with.  The secret is to join the table to itself with a greater-than join on the effective date, then use the earliest of the later effective dates from the joined table, minus a day, as your end date.

SELECT
    DT1.AccountID
    ,DT1.EffectiveDate
    --Take the earliest later effective date and subtract a day so the end date
    --is one day before the next start date; default the end date for the current record
    ,ISNULL(DATEADD(d, -1, MIN(DT2.EffectiveDate)), '20991231') AS EndDate
    ,DT1.Price
FROM
    dbo.PriceTable AS DT1
    LEFT OUTER JOIN
        dbo.PriceTable AS DT2
        ON DT2.AccountID = DT1.AccountID
        AND DT2.EffectiveDate > DT1.EffectiveDate
GROUP BY
    DT1.AccountID
    ,DT1.EffectiveDate
    ,DT1.Price
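To see what this gives you: if AccountID 1 has rows with effective dates 2008-01-01 and 2008-06-15, the first row gets an EndDate of 2008-06-14 and the second (current) row gets the default of 2099-12-31.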

Worth its weight in....

Following on from Sacha's post re Adatis achieving MS Gold partnership I just wanted to:

a) blow our own trumpet a little bit more ;-); and

b) add my thanks to our customers, partners and especially the awesome Adatis team who have worked their a*ses off to make this possible!

Whilst Gold partnership perhaps isn't the rarest thing these days, it's still a big deal for a small company like ours to achieve it.

I'll go now before I do a Gwyneth....sniffle...