Adatis BI Blogs

Stop Reporting Services (SSRS) 2008 from overwriting custom Parameter Datasets

Frustrating little quirk when building reports in SSRS 2008 using Visual Studio 2008.  If you write a custom query against a parameter dataset, and then change a query that references the parameter, the custom query used by the parameter is overwritten and reset to the default.  Now, you could keep a copy somewhere and replace it after every update, or, alternatively, use the <rd:SuppressAutoUpdate> tag by editing the .rdl file directly:

<Query>
  ...
  <rd:SuppressAutoUpdate>true</rd:SuppressAutoUpdate>
</Query>

I was close to raising this as a bug when I found someone had beaten me to it: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=417209
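For context, here is a sketch of where the tag sits inside a parameter dataset in the .rdl (the dataset name, data source name and query text below are illustrative; rd is the report designer namespace, xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner"):

```xml
<DataSet Name="RegionParameterDataset">
  <Query>
    <DataSourceName>MyDataSource</DataSourceName>
    <CommandText>SELECT DISTINCT Region FROM dbo.SalesRegions</CommandText>
    <!-- Stops the designer regenerating this query when referencing queries change -->
    <rd:SuppressAutoUpdate>true</rd:SuppressAutoUpdate>
  </Query>
</DataSet>
```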

Creating a Custom Gemini/PowerPivot Data Feed – Method 1 – ADO.NET Data Services

There are already a few good Gemini/PowerPivot blogs that provide an introduction into what it is and does, so there is no need for repetition.  What I haven't seen are examples of how existing investments can be harnessed for Gemini/PowerPivot based self-service analytics.  This series of posts focuses on various ways of creating Custom Data Feeds that can be used by Gemini/PowerPivot natively - providing a direct feed from otherwise closed systems opens up new channels of analytics to the end user.

Gemini/PowerPivot supports reading data from Atom-based data feeds, and this post looks at a quick way of creating an Atom-based feed that can be consumed by Gemini/PowerPivot.  By far the simplest way to develop an Atom-based data feed is to employ ADO.NET Data Services in conjunction with the ADO.NET Entity Framework.  With very few (in fact one and a bit!) lines of code, a data source can be exposed as a feed that Gemini/PowerPivot can read natively.

I am going to use the AdventureWorksDW sample hosted by a SQL Server 2008 R2 instance for this - obviously Gemini/PowerPivot natively reads SQL Server databases, so creating a custom feed over the top may seem a little pointless.  However, this technique may be useful for quick wins in several scenarios, including:

- Preventing the need for users to connect directly to the underlying data source.
- Restricting access to various elements of the data source (tables/columns etc).
- Applying simple business logic to raw data.

ADO.NET Data Services are a form of Windows Communication Foundation (WCF) service, and can therefore be hosted in various environments.  Here, I will simply host the ADO.NET Data Service inside an ASP.NET site.
To create a native Gemini/PowerPivot feed, you take seven steps:

1 - Create ASP.NET Web Application
2 - Create Entity Data Model
3 - Create the Schema
4 - Create the Data Service
5 - Load From Data Feed
6 - Create Relationships
7 - Test

Step 1) Create ASP.NET Web Application

I'm using Visual Studio 2008 here to create an ASP.NET Web Application.

Step 2) Create Entity Data Model

Add an ADO.NET Entity Data Model item to the project; these files have a .edmx extension and allow us to create a schema that maps to the underlying database objects.

Step 3) Create the Schema

We simply require a 1:1 mapping, so will 'Generate from Database'.  Incidentally, the 'Empty Model' option allows you to build a conceptual model of the database, resulting in custom classes that can optionally be mapped to the database objects later.

Create a Microsoft SQL Server connection to AdventureWorksDW2008 and select the appropriate database objects.  I've selected the following tables:

- DimCurrency
- DimCustomer
- DimDate
- DimProduct
- DimPromotion
- DimSalesTerritory
- FactInternetSales

Once the wizard has completed, a new .edmx file and associated .cs file are created that respectively contain an Entity Relationship Diagram and a set of auto-generated classes that represent the database objects.

Due to the way the Entity Framework handles Foreign Key Constraints, we have to apply a workaround to ensure the Foreign Keys on the FactInternetSales table are exposed and brought into Gemini/PowerPivot.  A previous post, Exposing Foreign Keys as Properties through ADO.NET Entity Framework, walks through the workaround.

Step 4) Create the Data Service

Add an ADO.NET Data Service item to the project.  The service class inherits from a generic version of the System.Data.Services.DataService object, so we need to inform the compiler what class to base the generic object on.  We essentially want to base our Data Service on the class representing our newly created Entity Data Model.
The class name is derived from the database name, unless changed when the Entity Data Model was created, so in our case the class name is AdventureWorksDW2008Entities.  The auto-generated service class contains a 'TODO' comment that asks you to 'put your data source class name here'.  The comment needs replacing with AdventureWorksDW2008Entities.

The final step is to expose the resources in the Entity Data Model.  For security reasons, a data service does not expose any resources by default; resources need to be explicitly enabled.  To allow read-only access to the resources in the Entity Data Model, the InitializeService method needs updating with a single line of code.  The code snippet below details the final class implementation; notice the AdventureWorksDW2008Entities reference at line 1 and the explicit resource enablement at line 6:

public class GeminiDataService : DataService<AdventureWorksDW2008Entities>
{
    // This method is called only once to initialize service-wide policies.
    public static void InitializeService(IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}

That's all that's needed; by default, ADO.NET Data Services conform to the Atom standard, so in theory the service is ready to be consumed by Gemini/PowerPivot.  Before we try, it's worth giving the service a quick test: building and running the solution (F5) launches Internet Explorer, navigating to the service hosted by the ASP.NET Development Server.  You are first presented with an XML document containing elements that represent database objects, and you can drill further into the objects by amending the URL.
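As an aside, ADO.NET Data Services also supports query interceptors, which is one way of implementing the 'restricting access' and 'simple business logic' scenarios mentioned earlier.  The sketch below is hedged: it assumes the generated DimCustomer entity class from the model above, and the predicate itself is purely illustrative.

```csharp
using System;
using System.Data.Services;
using System.Linq.Expressions;

public class GeminiDataService : DataService<AdventureWorksDW2008Entities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }

    // Applied automatically to every query against the DimCustomer resource.
    // The rule here is a trivially-true placeholder - substitute whatever
    // row-level filter your scenario actually needs.
    [QueryInterceptor("DimCustomer")]
    public Expression<Func<DimCustomer, bool>> OnQueryDimCustomer()
    {
        return c => c.CustomerKey > 0;
    }
}
```

Because the filter is applied server-side, Gemini/PowerPivot only ever sees the rows the service chooses to expose.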
For example, if you want to see the contents of the DimPromotion table, then append DimPromotion to the end of the URL (case sensitive):

http://localhost:56867/GeminiDataService.svc/DimPromotion

Note: You may need to turn off Feed Reader View in Internet Explorer to see the raw XML (Tools -> Internet Options -> Content -> Settings -> make sure 'Turn On Feed Reader View' is unchecked).

As a slight aside, the URL can be further enhanced to filter, return the top n rows, extract certain properties, etc.  Here are a couple of examples:

http://localhost:56867/GeminiDataService.svc/DimCustomer?$top=5 - Returns the top 5 customers
http://localhost:56867/GeminiDataService.svc/DimCustomer(11002) - Returns the customer with id 11002
http://localhost:56867/GeminiDataService.svc/DimCustomer(11002)/FirstName - Returns the first name of customer 11002
http://localhost:56867/GeminiDataService.svc/DimProduct(310)?$expand=FactInternetSales - Returns product 310 and all related Internet Sales records

Confident that the feed is working, we can now deploy the service and start using the feed in Gemini/PowerPivot.

Step 5) Load From Data Feed

Open up Excel 2010 and launch the Gemini/PowerPivot client (by selecting 'Load & Prepare Data').  Select 'From Data Feed' from the 'Get External Data' section of the Gemini/PowerPivot Home ribbon to launch the Table Import Wizard.  Specify the URL of the ADO.NET Data Service created earlier, in my case http://localhost:56867/GeminiDataService.svc, as the 'Data Feed Url' and click Next.  Incidentally, you can use most of the enhanced URLs here too, to, for example, only select the DimProduct table should you so wish; however, by specifying the root URL for the service you have access to all objects exposed by the service.  From the Table Import Wizard, select the required tables; in my case I'll select them all.  (You can optionally rename and filter the feed objects here too.)
Following the summary screen, the Gemini/PowerPivot client then gets to work importing the data from the ADO.NET Data Service.  Once completed, Gemini/PowerPivot displays all the data from all of the feed objects as if it came directly from the underlying database.

Step 6) Create Relationships

There is one final step before we can test our model using an Excel pivot table: we need to create the relationships between the tables we have imported.  The Gemini/PowerPivot client provides a simple, if a little onerous, way of creating relationships; the 'Create Relationship' action on the Relationships section of the Home ribbon launches the Create Relationship wizard.  Each table needs relating back to the primary fact table.

Step 7) Test

We are now ready to start our analysis.  Selecting PivotTable from the View section of the Gemini/PowerPivot client Home ribbon creates a pivot table in the underlying Excel workbook, attached to your custom-fed Gemini/PowerPivot data model.

So, to allow fast access to, for example, potentially sensitive data through Gemini/PowerPivot, you can quickly build a custom data feed that can be consumed natively by the Gemini/PowerPivot client data feed functionality.

HACK: Exposing Foreign Keys as Properties through the ADO.NET Entity Framework

First post for months; the PerformancePoint Planning announcement forced some redirection and rebuilding.  We've grieved, we've moaned, but at some point you just have to move on.

-----------------

I'm not a fan of hacks - it normally means you are doing something wrong - but in this case, where I'm after a quick win, I've had to work out and resort to a bit of a hack.  It actually looks like the issue I'm facing may be addressed in Entity Framework v2 (Microsoft .NET 4.0), so maybe it's more of a workaround than a hack after all ;o)

I'm using the ADO.NET Entity Framework and ADO.NET Data Services to expose a subsection of a database for consumption by Gemini.  In order to relate the exposed database objects together in Gemini, I need to apply this hack to ensure I have Foreign Keys available in my Gemini models to support creating the relationships.  By default, the Entity Framework exposes Foreign Keys as Navigation Properties rather than Scalar Properties, and Gemini does not consume Navigation Properties.

Let's take the scenario where I want to create an Entity Framework model based on the following tables from the AdventureWorksDW2008 sample database:

- FactInternetSales
- DimCustomer
- DimProduct
- DimSalesTerritory

Step 1)  Identify the table(s) that contain Foreign Keys.  In this case, FactInternetSales.

Step 2)  Load those table(s) into the Entity Framework model on their own.  This ensures the Foreign Keys are set up as Scalar Properties.  If you load in all the tables at once, the Foreign Keys are not exposed as Scalar Properties.

Step 3)  Load in the related tables (DimCustomer, DimProduct, DimSalesTerritory).  At this point a bunch of Navigation Properties will have been set up, along with relationships between the related tables, but the trouble now is that the project will no longer build.
If you try, you receive the following error for each relationship:

Error 3007: Problem in Mapping Fragments starting at lines 322, 428: Non-Primary-Key column(s) [CustomerKey] are being mapped in both fragments to different conceptual side properties - data inconsistency is possible because the corresponding conceptual side properties can be independently modified.

Step 4) Manually remove the relationships between tables.

Clicking on the relationship line on the diagram and hitting delete removes the relationship.

Step 5) Remove all Association Sets.

By editing the .edmx file manually in a text or XML editor, you need to remove all <AssociationSet>...</AssociationSet> occurrences from the <EntityContainer> section:

<EntityContainer Name="AdventureWorksDW2008Model1StoreContainer">
    <EntitySet Name="DimCustomer" EntityType="AdventureWorksDW2008Model1.Store.DimCustomer" … />
    <EntitySet Name="DimProduct" EntityType="AdventureWorksDW2008Model1.Store.DimProduct" … />
    <EntitySet Name="DimSalesTerritory" EntityType="AdventureWorksDW2008Model1.Store.DimSalesTerritory" … />
    <EntitySet Name="FactInternetSales" EntityType="AdventureWorksDW2008Model1.Store.FactInternetSales" … />
    <AssociationSet Name="FK_FactInternetSales_DimCustomer" Association="AWDW08.FK_FactInternetSales_DimCustomer">
        <End Role="DimCustomer" EntitySet="DimCustomer" />
        <End Role="FactInternetSales" EntitySet="FactInternetSales" />
    </AssociationSet>
    <AssociationSet Name="FK_FactInternetSales_DimProduct" Association="AWDW08.FK_FactInternetSales_DimProduct">
        <End Role="DimProduct" EntitySet="DimProduct" />
        <End Role="FactInternetSales" EntitySet="FactInternetSales" />
    </AssociationSet>
    <AssociationSet Name="FK_FactInternetSales_DimSalesTerritory" Association="AWDW08.FK_FactInternetSales_DimSalesTerritory">
        <End Role="DimSalesTerritory" EntitySet="DimSalesTerritory" />
        <End Role="FactInternetSales" EntitySet="FactInternetSales" />
    </AssociationSet>
</EntityContainer>

The project should now build, with the Foreign Keys exposed as Scalar Properties.  Obviously no inherent relationships exist, so this could be dangerous in certain applications.  For Gemini, however, providing you set up the relationships manually, it works a treat.

PerformancePoint Server SP2

A good few months ago I wrote about SP1 for PerformancePoint and getting it working on SQL Server 2008.  There were a number of attempts and, with a great deal of hacking, I got an untested environment set up; however, this was not very stable and never fully tested.  SP2 is due for release in December, and there are mentions of a beta available on the web, although it is unavailable through MSDN or Connect.  It seems that the two promises of SP2 are Hyper-V compatibility and SQL Server 2008 integration.  A few good resources for this are as follows:

http://hmorgenstern.spaces.live.com/Blog/cns!28A6BE83102A0EB3!419.entry
http://performancepointing.blogspot.com/2008/11/performance-point-server-sp2-pre.html

Reporting Services 2008 - Connection Problems

Having recently rebuilt a server entirely with "2008" technology, I hit a small problem with Reporting Services.  When trying to look at the reports, I received the error:

"The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel."

I found that even though we are not trying to look at the secure site, the certificate causes issues.  For development and testing purposes I found it easiest to remove the certificate.  To do this, perform the following steps:

1. Open Reporting Services Configuration Manager.
2. Click Web Service URL and click "Advanced".
3. Remove the SSL certificates, both for IPv4 and IPv6, click OK, and then Apply.

This will remove the SSL certificates; you should now be able to use Reporting Services 2008.

PPS SP1 and SQL Server 2008 RC0

So, in one of my previous posts I explored using PPS with the SQL 2008 February CTP.  Now, with the release of PPS SP1 and SQL 2008 RC0, it seemed a perfect time to re-test the installation.  With a completely new installation of Windows Server 2008, SQL 2008 RC0 and PPS SP1, the testing has started all over again.

The installation of SQL 2008 now goes smoothly and is very easy, as is the installation of PPS and PPS SP1.  However, the issues come in after installing all of PPS and then attempting to configure PPS Planning (Monitoring to come later this month).  After installing SQL 2008, you will also need to install the following items from the SQL 2008 and SQL Server 2005 feature packs:

- Analysis Management Objects 9.0 (SP2)
- SQL Server Native Client 9.0
- ADOMD.NET 9.0 SP2

After installing these, we can now get past the pre-requisite page, which also allows the use of the relational part of SQL Server 2008.  However, Analysis Services still does not accept the SQL 2008 instance; after much investigation, it does not seem possible to install PPS, even with SP1, against a SQL 2008 Analysis Services instance.  With much regret, I have currently installed an instance of SQL 2005, just for Analysis Services; this allows the install to complete.

Now, many people have said that the 2005 instance can be upgraded to SQL 2008; however, I have found that deploying a new model site still fails once upgraded to SQL 2008.  This investigation will continue.

SQL 2008 Feature Pack

Not so much a pack as a collection of very useful tools, which can be downloaded here:

http://www.microsoft.com/downloads/details.aspx?FamilyId=D68DE3C9-60A9-49C9-A28C-5C46BBC3356F&displaylang=en

I will be investigating the Reporting Services integration with MOSS 2007 very soon, but for the time being here is some information on the downloads that are being called the SQL 2008 feature pack.

Reporting Services Add-in for Microsoft SharePoint Technologies - pretty much the same as the one for SQL 2005, but with the ability to use the enhanced reporting in 2008.

SQL Server Native Client - one DLL that contains the OLE DB and ODBC drivers for close application integration; they allow the development of new applications that take advantage of some of the new features in SQL 2008.

SQL Server 2005 Backward Compatibility Components - pretty much what it says on the tin: the Data Transformation Services 2000 runtime (DTS), SQL Distributed Management Objects (SQL-DMO), Decision Support Objects (DSO) and the SQL Virtual Device Interface (SQLVDI) are all supported with this feature pack.

SQL Server Data Mining Add-ins for Microsoft Office 2007 - allows you to take advantage of SQL Server 2008 predictive analytics in Excel 2007 and Visio 2007.  Table Analysis for Excel allows you to use simple data mining models on data contained within your spreadsheets, and Data Mining for Visio allows you to render data mining models directly in Visio.

All in all, pretty good stuff for nothing!

SQL Server 2008 - CTP

OK, so we should have been getting our mitts on SQL Server 2008 RTM yesterday, but it has been delayed until Q3, which has been known for some time.  Microsoft did, however, go ahead with the official launch yesterday, regardless of the fact that the product has not been released to manufacturing.  To whet our appetites, Microsoft have released another CTP (February), which can be downloaded here: Download

SQL Server 2008 is an enhancement to an already extremely successful end-to-end BI stack that delivers a well-rounded platform for the majority of organisations' planning, reporting and analysis needs.  Microsoft's BI and performance management offering works best when it is implemented as part of a full stack of Microsoft infrastructure that includes SQL Server (including 2008), SharePoint Server 2007 and Microsoft Office (including Excel 2007).  This platform is complemented by PerformancePoint Server 2007, and the obvious tight integration with Excel.

Microsoft are also making further inroads into competing with Teradata, IBM and Oracle by improving the relational database engine with features such as partitioned tables and support for partitioned table parallelism, which position it more effectively within the large-scale enterprise data warehousing market where IBM, Oracle and Teradata all play.  Although still not ready for databases that run into the hundreds of terabytes and possibly the petabyte range, the company is making inroads: it currently has customers working comfortably in the one to 15 terabyte range and expects SQL Server 2008 to extend this into the 20 to 50 terabyte range.

SQL 2008 - what's changed, and what's new?

As I am sure you can all guess, SQL 2008 is the next release of SQL Server; it does not reinvent the wheel and is more an enhancement to 2005.  Microsoft claim that they have made it more secure, more reliable, more scalable etc. etc.

Business Intelligence:

Analysis Scale and Performance: Allows more complex computations and aggregations, and brings new cube design tools.

Block Computations: Provides a significant improvement in processing performance, enabling users to increase the depth of their hierarchies and the complexity of their computations.

Writeback: New MOLAP-enabled writeback capabilities in SQL Server 2008 Analysis Services remove the need to query ROLAP partitions.  This provides users with enhanced writeback scenarios from within analytical applications without sacrificing traditional OLAP performance.

Enterprise Reporting Engine: Reports can easily be delivered both internally and externally, with easier deployment and configuration.

Internet Report Deployment: Allows deployment of reports over the internet.

Report Builder Enhancements: Easily build ad-hoc reports and author reports with any structure through Report Designer.

Report Server Application Embedding: Report Server application embedding enables the URLs in reports and subscriptions to point back to front-end applications.

Microsoft Office Integration: SQL Server 2008 provides new Word rendering that enables users to consume reports directly from within Microsoft Office Word.  In addition, the existing Excel renderer has been greatly enhanced to accommodate features like nested data regions and sub-reports, as well as merged-cell improvements.  This lets users maintain layout fidelity and improves the overall consumption of reports from Microsoft Office applications.

Development:

Language Integrated Query (LINQ) (see previous blog entries on LINQ): Enables developers to issue queries against data using a managed programming language such as C# or VB.NET, instead of SQL statements.
Enables seamless, strongly-typed, set-oriented queries written in .NET languages to run against ADO.NET (LINQ to SQL), ADO.NET DataSets (LINQ to DataSets), the ADO.NET Entity Framework (LINQ to Entities) and the Entity Data Service Mapping provider.  The new LINQ to SQL provider enables developers to use LINQ directly on SQL Server 2008 tables and columns.

ADO.NET Object Services: The Object Services layer of ADO.NET enables the materialization, change tracking and persistence of data as CLR objects.  Developers using the ADO.NET framework can program against a database using CLR objects that are managed by ADO.NET.  SQL Server 2008 introduces more efficient, optimized support that improves performance and simplifies development.

DATE/TIME: SQL Server 2008 introduces new date and time data types:

- DATE - a date-only type
- TIME - a time-only type
- DATETIMEOFFSET - a time-zone-aware datetime type
- DATETIME2 - a datetime type with larger fractional seconds and year range than the existing DATETIME type

HIERARCHYID: Enables database applications to model tree structures in a more efficient way than currently possible.  The new system type, HierarchyId, can store values that represent nodes in a hierarchy tree.  This new type is implemented as a CLR UDT, and exposes several efficient and useful built-in methods for creating and operating on hierarchy nodes with a flexible programming model.

FILESTREAM Data: Allows large binary data to be stored directly in an NTFS file system while remaining an integral part of the database, with transactional consistency maintained.  Enables large binary data, traditionally managed by the database, to be scaled out and stored outside the database on more cost-effective storage without compromise.

Integrated Full Text Search: Integrated Full Text Search makes the transition between text search and relational data seamless, while enabling users to use text indexes to perform high-speed text searches on large text columns.
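As a quick, hedged sketch of the new date and time types described above (T-SQL; the literal values and precisions are purely illustrative):

```sql
-- New SQL Server 2008 date/time types
-- (inline initialisation in DECLARE is itself new in 2008)
DECLARE @d  DATE            = '2008-02-27';                          -- date only
DECLARE @t  TIME(7)         = '23:59:59.9999999';                    -- time only, 100ns precision
DECLARE @dt DATETIME2(7)    = '2008-02-27 23:59:59.9999999';         -- wider range/precision than DATETIME
DECLARE @o  DATETIMEOFFSET  = '2008-02-27 23:59:59.9999999 +01:00';  -- time-zone aware

SELECT @d AS [Date], @t AS [Time], @dt AS [DateTime2], @o AS [Offset];
```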
Sparse Columns: NULL data consumes no physical space, which provides a highly efficient way of managing empty data in a database.  For example, Sparse Columns allow object models that typically have numerous NULL values to be stored in a SQL Server 2008 database without incurring large space costs.

Large User Defined Types: SQL Server 2008 eliminates the 8KB limit for User Defined Types (UDTs), allowing users to dramatically expand the size of their UDTs.

Spatial Data Types: Build spatial capabilities into your applications by using the support for spatial data.  Implement round-earth solutions with the geography data type, using latitude and longitude coordinates to define areas on the Earth's surface.  Implement flat-earth solutions with the geometry data type, storing polygons, points and lines that are associated with projected planar surfaces and naturally planar data, such as interior spaces.

Security:

Transparent Data Encryption: Enables encryption of an entire database, data files or log files, without the need for application changes.

Extensible Key Management: A comprehensive solution for encryption and key management; supports third-party key management and HSM products.

Auditing: Create and manage auditing via DDL, enabling answers to common questions such as "What data was retrieved?"

Built-in Forms Authentication: Built-in forms authentication enables users to easily switch between Windows and Forms authentication.

Redundancy and Performance:

Enhanced Database Mirroring: Eliminates the need to specify the failover partner, which enables existing applications to take advantage of mirroring.

Resource Governor: Provides a consistent and predictable response to end users with the introduction of the Resource Governor.

Predictable Query Performance: Enables greater query performance stability and predictability by providing functionality to lock down query plans.

Data Compression: Enables data to be stored more effectively and reduces the storage requirements for your data.
Data Compression also provides significant performance improvements for large I/O-bound workloads like data warehousing.

Hot Add CPU: Dynamically scale a database on demand by allowing CPU resources to be added.

Declarative Management Framework (DMF): Use DMF with SQL Server Management Studio to create policies that manage entities on the server, such as the instance of SQL Server, databases and other SQL Server objects.

Performance Data Collection: More extensive performance data collection, a new centralized data repository for storing performance data, and new tools for reporting and monitoring.

Backup Compression: Keeping disk-based backups online is expensive and time-consuming.  With SQL Server 2008 backup compression, less storage is required to keep backups online, and backups run significantly faster since less disk I/O is required.

Partitioned Table Parallelism: Partitions enable organizations to manage large, growing tables more effectively by transparently breaking them into manageable blocks of data.  SQL Server 2008 builds on the advances of partitioning in SQL Server 2005 by improving the performance on large partitioned tables.

Star Join Query Optimizations: SQL Server 2008 provides improved query performance for common data warehouse scenarios.  Star join query optimizations reduce query response time by recognizing data warehouse join patterns.

Grouping Sets: Grouping Sets is an extension to the GROUP BY clause that lets users define multiple groupings in the same query.  Grouping Sets produces a single result set that is equivalent to a UNION ALL of differently grouped rows, making aggregation querying and reporting easier and faster.

Change Data Capture: With Change Data Capture, changes are captured and placed in change tables.  It captures the complete content of changes, maintains cross-table consistency, and even works across schema changes.  This enables organizations to integrate the latest information into the data warehouse.
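A hedged T-SQL sketch of the Grouping Sets feature described above (the table and column names are illustrative, loosely modelled on AdventureWorksDW):

```sql
-- One pass produces subtotals by year, subtotals by territory, and a grand total -
-- equivalent to a UNION ALL of three separately grouped queries.
SELECT YEAR(OrderDate)  AS OrderYear,
       SalesTerritoryKey,
       SUM(SalesAmount) AS TotalSales
FROM dbo.FactInternetSales
GROUP BY GROUPING SETS
(
    (YEAR(OrderDate)),     -- subtotal per year
    (SalesTerritoryKey),   -- subtotal per territory
    ()                     -- grand total
);
```

Columns not included in a particular grouping set come back as NULL in that set's rows, which is how the single result set represents the different grouping levels.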
MERGE SQL Statement: With the introduction of the MERGE SQL statement, developers can more effectively handle common data warehousing scenarios, like checking whether a row exists and then executing an insert or update.

SQL Server Integration Services (SSIS) Pipeline Improvements: Data integration packages can now scale more effectively, making use of available resources and managing the largest enterprise-scale workloads.  The new design improves the scalability of the runtime onto multiple processors.

SQL Server Integration Services (SSIS) Persistent Lookups: The need to perform lookups is one of the most common ETL operations.  This is especially prevalent in data warehousing, where fact records need to use lookups to transform business keys to their corresponding surrogates.  SSIS increases the performance of lookups to support the largest tables.
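Finally, a hedged sketch of MERGE handling the insert-or-update scenario just described (the target and staging table names and columns are made up for illustration):

```sql
-- Upsert in a single statement: update customers that already exist,
-- insert the ones that don't.
MERGE dbo.DimCustomer AS tgt
USING staging.Customer AS src
    ON tgt.CustomerAlternateKey = src.CustomerAlternateKey
WHEN MATCHED THEN
    UPDATE SET tgt.EmailAddress = src.EmailAddress
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerAlternateKey, EmailAddress)
    VALUES (src.CustomerAlternateKey, src.EmailAddress);  -- MERGE must be terminated with a semicolon
```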

Install Performancepoint Server on SQL Server 2008

Testing the latest technology on the latest platforms is always very important to us, in order to ensure we are up to date, and to give our clients an unbiased view of what's out there, what's coming up, and how it will affect them.  We have been involved with PerformancePoint for some time now and have had good success; however, with the impending release of SQL Server 2008 and Windows Server 2008, we thought it time to get up to date with PPS on a Server 2008 box running SQL 2008.  With Windows Server 2008 in RC1 and SQL Server 2008 in RC1, we know we are now close to testing the fully fledged version.  So, time to install it all.

Windows Server installs with no dramas at all, quite quickly in fact.  Following that is SQL 2008, which also installs without any hiccup this time; however, it does take a very long time to install, and interestingly installs Visual Studio 2005, not 2008 (which is released at the same time as SQL and Windows Server).  So we are up and running; we have a perfect install of SQL 2008 on a Windows Server 2008 server.  Time to install PPS.

PPS is a small install, just a 60MB download, and four different installations: Planning Server, Business Modeller, Excel Add-In and Monitoring Server.  For the server install we just want the Planning Server and the Monitoring Server for the time being.  We will start with the Planning Server.

On checking the prerequisites for this, we instantly come up against a problem: PPS Planning looks for a specific version of SQL, SQL Server 2005, and therefore does not recognise SQL 2008.  This causes somewhat of a problem.  Changing the registry to try to trick the installer into seeing the correct version doesn't seem to help either.  All the other components seem to install fine, and therefore the only issue currently is Planning.  An obvious solution would be to install SQL 2005 side by side, and simply use the SQL 2008 database.
However, this is not an ideal solution, and I will be continuing work on this in the new year, as well as raising the issue with Microsoft.  I will post a follow-up when we have resolved the installation issues.  In the meantime, have a great new year!

Install SQL Server 2005, After 2008

So, if like me, you wanted to look at SQL 2008 and thought it best to totally remove 2005 and then install 2008 (Katmai), then realised that the Katmai BI tools did not install properly, you may want to re-install 2005.  If you have got to this stage, you may have found an annoying installation bug, even after removing 2008.  It concerns the SQL Native Client: it fails to install, and therefore most of SQL 2005 fails to install.  After numerous further attempts to install, it turns out there is a simple fix.  From the SQL 2005 disc, run Sqlncli.msi:

- Choose Remove (Repair doesn't work!)
- Once removed, install SQL Server 2005 as normal

Easy - when you know how!