I’ve just found out that the first CTP for SQL Server codename Denali is now available for download. The link to get both the 32 and 64 bit versions is:
On a related note, Simon Sabin has recently posted here about the Tech-Ed Europe keynote that shows a glimpse of a new Silverlight reporting tool for SQL that’s part of Denali. Well worth watching the keynote video…
Subscription Views are one of the ways to extract data from Master Data Services. Created via the ‘Data Integration’ section of the MDS Master Data Manager front end, they result in the creation of a physical SQL Server view in the underlying database.
All this you will probably know from the MSDN article. The purpose of this post, however, is to explain a few of the more subtle differences between the types of subscription views that get created, and to look at how they can be used to feed a data warehouse.
You can create a subscription view from a derived hierarchy, which will return all the levels from your hierarchy in a de-normalised form. If you choose the sample Product model, then you will see that one of the Derived Hierarchies that you can pick from is called Category. As your Product dimension in a data warehouse would include attributes such as Product Category and Sub Category, then on the surface this view looks to be ideal.
Unfortunately there is a slight issue with subscription views based on derived hierarchies. Users could have altered one of the members that exist within the hierarchy before the data warehouse nightly load has taken place. If the MDS validation of that member hasn’t occurred yet (which it doesn’t in some situations), then the member could potentially contain some incorrect attribute values that we don’t want to expose to the data warehouse.
For example, a user has edited a product, as shown below, but chosen a different Sub Category to that of its adjacent members:
As indicated by the yellow question mark above, the member is awaiting validation. However, when I run a select statement against my view that's based on the Derived Hierarchy format, the member that is marked above will still appear in the view results. We would not want to load this change into the data warehouse, as until the business rules have been run, we don't know if the new Sub Category is valid in the context of the other attributes. For reference, the same behaviour exists for members that have explicitly failed validation.
You can also create a subscription view by picking an entity, and then choosing the format of Leaf Attributes, as shown below:
In my case, I’ve set up a subscription view called ProductLeafAtributes, and when I run a select statement against this view in SQL Server, then I get the following results:
As with the subscription view that's based on a Derived Hierarchy format, I still get all the members returned, but there's also a column called Validation Status, which shows that one of the members is awaiting revalidation. This column can therefore be filtered on in the data warehouse load, so that only the changes that have definitely passed validation are taken through.
Creating the subscription views based on entities, rather than Derived Hierarchies, exposes the validation status of a member, and is therefore the preferred subscription view format to use in this case. The one catch is that the Leaf Attributes format will not return attributes of any related entities, e.g. Product Category. It will return the Sub Category code, so it's therefore necessary to join to separate subscription views for both Sub Category and Category in order to pick up the missing attributes.
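As a sketch of what the data warehouse load query could then look like, the following joins the three leaf-attribute subscription views and filters on the validation status. Note that the view names, the `_Code` column naming and the 'Validated' status value here are assumptions based on my setup, so adjust them to match your own subscription view definitions:

```sql
SELECT  p.Code   AS ProductCode,
        p.Name   AS ProductName,
        sc.Name  AS SubCategoryName,
        c.Name   AS CategoryName
FROM    ProductLeafAttributes p
        INNER JOIN SubCategoryLeafAttributes sc ON sc.Code = p.SubCategory_Code
        INNER JOIN CategoryLeafAttributes c     ON c.Code  = sc.Category_Code
-- Exclude members that are awaiting (re)validation or have failed validation
WHERE   p.ValidationStatus = 'Validated'
```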
The SQL Bits 7 agenda was published recently, and I’m pleased to say that I will be delivering a session titled ‘End to End Master Data Management with SQL Server Master Data Services’.
The heart of this session will be an end-to-end technical demo that will show the role of MDS in extracting master data from source systems, before demonstrating how data is fed via MDS to subscribing downstream systems. I’ll cover areas such as the MDS staging process, business rules and notifications amongst others.
I’ve also seen from Simon Sabin that the SQLCAT team will be showing some of the features of SQL 11, code name Denali. Well worth going for that alone surely?!
You can register for SQL Bits here.
Back to a bit of Master Data Services this week – there’s been a few learning resources that have appeared here and there for MDS over the past couple of months. I thought that it would be worth rounding up the ones that I’ve noticed:
There was a comment on my last blog post by Peter Eb asking about whether Reporting Services 2008 R2 knows how to render SSAS KPIs automatically. The short answer is that it doesn't unfortunately, but there are only a few steps needed to get it working.
For this blog post I've built a simple report that uses the Adventure Works DW 2008 R2 Analysis Services database as a data source. The first step is to build a data set that contains a query that references the Month attribute and the Channel Revenue KPI:
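For reference, the KPI parts can be brought into the dataset with the MDX KPI functions. A query along these lines (a sketch against the Adventure Works sample cube, with the KPI and hierarchy names taken from that sample) returns the value, status and trend by month:

```mdx
WITH MEMBER [Measures].[Channel Revenue Value]  AS KPIValue("Channel Revenue")
     MEMBER [Measures].[Channel Revenue Status] AS KPIStatus("Channel Revenue")
     MEMBER [Measures].[Channel Revenue Trend]  AS KPITrend("Channel Revenue")
SELECT { [Measures].[Channel Revenue Value],
         [Measures].[Channel Revenue Status],
         [Measures].[Channel Revenue Trend] } ON COLUMNS,
       [Date].[Calendar].[Month].Members ON ROWS
FROM   [Adventure Works]
```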
I'm interested in reporting on the status and trend part of the KPI, so I've added the month and value parts of the KPI, plus blank columns for the status and the trend:
For the status and the trend columns, the next step is then to drag on (or right click and insert) an indicator in each of the cells. This is when the manual steps come in:
- Right click on the indicator to open the properties window;
- Set the Value of the indicator to be the Status part of the Analysis Services KPI;
- Change the States Measurement Unit to Numeric;
- Configure the indicator Start and End properties to be -1, 0 and 1. SSAS KPIs can have an unlimited number of status values, so it may be necessary to add more states here if SSAS returns more than 3 values for the Status.
The changes that are needed to the indicator properties window are shown below:
Applying the same method to the Trend part of the KPI produces the following simple report:
This post wasn't really supposed to compare SSRS vs PerformancePoint Services, but I've found out that PerformancePoint Services does pick up the SSAS KPI metadata (with only a slight mismatch on the colours), saving you having to set up the indicator yourself:
It's a shame that the KPIs aren't picked up automatically by SSRS, but then again the above process isn't too difficult to master.
Although you've always been able to deliver KPIs in Reporting Services by using images and expressions, Reporting Services in SQL Server 2008 R2 now includes a feature called Indicators, which assists with producing KPIs out of the box.
This got me thinking how producing KPIs and/or scorecards in Reporting Services compares to doing the same in PerformancePoint Services. This blog post gives a short overview of creating KPIs in each technology, before comparing the pros and cons of each.
Reporting Services 2008 R2 Indicators
The indicators in Reporting Services come as an extra item on your toolbox, and can be dragged onto a Tablix for example. Once you do add the indicator to your report, you'll be prompted to select the shape that you want for your indicator:
So no more messing about with images! Once you click OK your indicator will be added to your report. If you then right click on the properties of the indicator, you can choose what to base the indicator's value on, and also set its states, both of which essentially control the points at which your indicator changes its colour and shape. What I quite like here is that it's easy to change the colour for each state, and it's also easy to add new states, if you need four colours rather than three for example:
You seem to be able to add a huge number of indicator states here, certainly more than enough to deal with KPIs. Taking some data from AdventureWorks, based around employee sales targets, I've created the following very basic report using the Indicators:
PerformancePoint Services KPIs
The KPI and Scorecard have had a few decent changes in SharePoint 2010 (e.g. drill up/down), but Dashboard Designer is still the tool that's used to create KPIs in PerformancePoint Services, and it's a different experience to developing in Visual Studio.
Assuming that you get to the point where you have a data source set up, the first thing to do in order to create a KPI is to pick an indicator. The choice here is very similar to the choice in SSRS:
The alternative here is to pick a blank indicator, where you'll have to decide yourself how many levels you want for your indicator, and what the colours and shapes should be. You can pick an image for an indicator state (as you can with SSRS), although you are limited to a maximum of 10 states. Again, that should be more than enough for KPIs.
The objects within PerformancePoint Services are very modularised, meaning that you create indicators, KPIs and scorecards separately. If you need to re-use an indicator in a new KPI, then you can just point to an existing indicator, or alternatively you can create a new one.
Again, taking some sample data from AdventureWorks, I've produced the following basic scorecard, based on similar data:
The following table lists some KPI/Scorecard features and compares how SSRS and PPS are able to deal with them:
| Feature | PerformancePoint Services | Reporting Services 2008 R2 |
| --- | --- | --- |
| Parent KPIs | Out of the box parent KPIs | No built-in parent KPIs |
| Scoring | Two built-in scoring methods; weighting concept to assist with scoring | No built-in scoring capability |
| Data sources | Analysis Services; SQL Server; SharePoint List | SQL Server; Analysis Services; OLE DB, ODBC; SharePoint List; SAP Netweaver BI |
| Layout | KPIs can only exist within scorecards; a scorecard can only contain KPIs and their associated indicator and data items | Indicators can be embedded in other objects or exist by themselves; other objects (e.g. spark lines) can co-exist alongside indicators in the same Tablix |
| Formatting | Colours and number formats of the scorecard grid can be changed; colours/fonts in data cells cannot be changed | If the indicator is embedded in a Tablix, each cell can be altered in terms of colours and number formats |
| Ease of use | As long as your data is in good shape, it's possible for power users to build KPIs and scorecards with Dashboard Designer relatively easily | The indicator windows are very intuitive, but I'd say that building an SSRS report is harder for a power user than building a PPS dashboard |
| Other | Scorecards allow users to enter comments against KPI values, which is a great feature | The standard SSRS expressions allow you to fine-tune indicators, states, values and formatting of your KPIs |
In summary, PerformancePoint Services in SharePoint 2010 will allow you to get some great looking KPIs and Scorecards up and running much quicker, as that's what it's designed to do. It is, however, more restrictive than Reporting Services, so SSRS is the better choice if your KPI requirements are complex in terms of layout or formatting.
With the release of SQL Server 2008 R2 nearly upon us, it's a safe bet that the number of technical articles for MDS will start to increase a bit. In the meantime, I felt it was worth mentioning a few older MDM White Papers that may come in useful:
Although I didn't mention it, the inspiration for my last post was the excellent MDX Solutions 2nd Edition. Whilst flicking through the book a while back, I thought it would be interesting to see how DAX and PowerPivot could handle the so called 'Common Calculations and Selections in MDX'.
This post continues that theme, focusing on one calculation in particular that's in the aforementioned MDX book, namely 'Unweighted Allocations down the Hierarchy'. Essentially this business problem is dealing with allocating data that is entered at a higher level (e.g. Quarter) down to a lower level (e.g. Day). It varies from business to business what the users actually want to see at the day level in this situation. Some expect to see the Quarter amount, some want to see a blank value and then some users want the amount in the quarters allocated down to the day level.
If the expectation is that data should be allocated, then one way of doing the allocation is to use a ratio. In the MDX book, this is achieved by the following MDX:
If we put this into a query and run it against the Calendar hierarchy in AdventureWorksDW we get the following results:
Therefore our quarter-entered value can simply be multiplied by this ratio to get the correct value at each level in the Calendar hierarchy.
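The book's exact MDX isn't reproduced above, but a ratio of this kind can be sketched along the following lines (against the Adventure Works sample cube; the level names are from its [Date].[Calendar] hierarchy, and this is my own formulation rather than the book's):

```mdx
WITH MEMBER [Measures].[Ratio] AS
    IIF( [Date].[Calendar].CurrentMember.Level.Ordinal < 3,
         1,   -- at quarter level and above there is nothing to allocate down to
         1 / Count(
               Descendants(
                 Ancestor( [Date].[Calendar].CurrentMember,
                           [Date].[Calendar].[Calendar Quarter] ),
                 [Date].[Calendar].CurrentMember.Level ) ) )
SELECT { [Measures].[Ratio] } ON COLUMNS,
       [Date].[Calendar].[Month].Members ON ROWS
FROM   [Adventure Works]
```

At the month level this counts the 3 months in the current quarter (giving 1/3), and at the day level it counts the days in the quarter (giving, say, 1/90).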
So the challenge is how to be able to do this in PowerPivot. I started off by looking for some data in AdventureWorksDW that would fit the bill. A good candidate seems to be the FactSalesQuota table, which I imported into PowerPivot along with its related dimensions:
Once in the pivot table, I wanted to be able to calculate the correct ratio at the daily and monthly levels. For the daily level this is 1 / 90, as there are 90 days in the Q1 that year, and for the month level it is 1/3 as there are 3 months in the quarter.
Given that there's no MDX style Ancestor() function in DAX, I ended up having to have a different calculation at different levels. Something that I learnt from this forum post is that you need to use IF() in order to isolate a calculation at a different level. Therefore the DAX that I came up with was:
=IF( COUNTROWS(VALUES('DimDate'[FullDateAlternateKey])) = 1,
1 / CALCULATE( COUNTROWS('DimDate'), ALL('DimDate'[FullDateAlternateKey], 'DimDate'[EnglishMonthName]) ),
1 / CALCULATE( COUNTROWS( DISTINCT( 'DimDate'[EnglishMonthName]) ), ALL('DimDate'[EnglishMonthName]) )
)
The COUNTROWS(VALUES('DimDate'[FullDateAlternateKey])) = 1 allows me to check that we're at the day level, and if we are at the day level, then count the number of days in the quarter. Otherwise, if we're at the month level we calculate the number of months in the current quarter. When this DAX is put into a new measure we get the following results:
This produces the same results as the MDX statement. The next step was then to apply this ratio to the Sales Amount Quota for the current quarter. This was achieved using the following DAX:
=IF(
COUNTROWS( VALUES('DimDate'[FullDateAlternateKey]) ) < 90,
CALCULATE( Sum('FactSalesQuota'[SalesAmountQuota]),
ALL('DimDate'[FullDateAlternateKey], 'DimDate'[EnglishMonthName]) ) * 'FactSalesQuota'[Ratio],
Sum('FactSalesQuota'[SalesAmountQuota])
)
The < 90 at the start of the statement allows me to check if we're at the day level or the month level. If we're at either of these levels, then we want to pick up the Quarter-level SalesAmountQuota and multiply by the ratio. Otherwise, at the Quarter and Year levels, we just apply the original SalesAmountQuota value. This produces the following results:
The use of the MDX Scope statement would have been nice to pinpoint exactly where I wanted the calculation to occur, but then again, the use of IF() keeps it quite simple, which is better for an Excel power user. Whilst MDX is very powerful, can you imagine a business user getting to grips with MDX statements such as Scope or Freeze?
The bottom line is that the DAX calculation produces exactly what I want - the new AllocatedSalesQuota measure is correct at all levels in the Calendar hierarchy. Even with an IF() statement, like everything else in PowerPivot, the calculation performs very quickly.
One of the often required calculations for users in reports is to be able to calculate a point-in-time closing balance, which is especially relevant when dealing with stock levels or financial data.
In the Analysis Services world, at least in the Enterprise Edition, we're lucky that we have a variety of Aggregation Functions that can deal with semi-additive measures such as a closing balance. Or we can always turn to MDX if we need to which will achieve the same thing.
In PowerPivot, we don't have Aggregation Functions, but there is DAX, which has a variety of time intelligence functions up its sleeve. The idea behind this post is to see how PowerPivot would deal with the requirement of reporting a closing balance measure.
DAX Time Intelligence Functions
One of the DAX functions that we can use for our closing balance is called ClosingBalanceMonth(), which will simply evaluate an expression at the end of a given month. There's also the similar ClosingBalanceQuarter() and ClosingBalanceYear().
Having noticed these functions within PowerPivot, my first step was to set up a simple PowerPivot model by taking data from the FactInternetSales table in AdventureWorksDW, plus all its related dimensions of course. What I ended up with was the following simple pivot table in Excel 2010:
Sales Amount obviously isn't a semi-additive measure, but I'm treating it as one in this example. My first port of call was to use ClosingBalanceMonth() to try and get the value for the month to be equal to the value for last day in the month. I managed to come up with the following formula:
=CLOSINGBALANCEMONTH(Sum([SalesAmount]), 'DimDate'[FullDateAlternateKey], ALL('DimDate'[FullDateAlternateKey]))
This produces the results that are shown below, in the measure called ClosingBalanceFormula:
It does a good job of calculating a closing balance for each month, but it presents that closing monthly balance on each day. Therefore, I started looking for an alternative approach and found the LastDate() function. It returns the last date that's in context for the passed date column. The last date that's in context at the month level will be the last date in the month, but thankfully the last date that's in context at the day level will be the current day. Therefore I came up with the following formula:
='FactInternetSales'[Sum of SalesAmount](LASTDATE('DimDate'[FullDateAlternateKey]))
This produced the results that I wanted - the daily closing balance at the day level and the monthly closing balance at the month level:
A bit more digging led me to find out that a combination of the Calculate() and LastDate() functions gets to the same result, as an alternative. That formula is:

=CALCULATE(Sum('FactInternetSales'[SalesAmount]), LASTDATE('DimDate'[FullDateAlternateKey]))
It's a shame that the ClosingBalanceMonth() formula didn't quite work, but perhaps getting that approach to work will come out in the wash with more DAX experience. However, as with MDX or any kind of expression language, there's always going to be a load of different ways to get to the same result. It's much more efficient to work knowing that you've got several ways around a particular problem, rather than being forced down a particular path, and in fact, I think the range of calculation functions available shows the richness of the DAX language.
Technorati Tags: PowerPivot
A consultant that I know has recently been through the process of reviewing his company's management information requirements, and also the process of evaluating what technology platform can deliver those requirements.
Naturally, a data warehouse was one of the options. A data warehouse, in my opinion, will provide the enterprise-level platform for any organisation to deliver their management information.
In this case though, due diligence flagged up some alternative approaches to a data warehouse, namely the raft of self service BI tools that are out there, including PowerPivot. Other tools that can take data in from a variety of data sources are Tableau (which is also a good front end for Analysis Services) and Spotfire for example.
I think that all of these tools have their merits. PowerPivot is incredibly fast and easy to use, whereas Tableau and Spotfire for example have some handy data visualisation capabilities. Some vendors actually claim that self service tools can replace a data warehouse. Can this really be the case? Whereas these tools do a good job of presenting data in a dimensional model, my personal view is that a data warehouse of course also carries this out, but in addition delivers the following benefits to name a few:
- Data Cleansing - The power of SQL in the backend and also features like Fuzzy Matching in SSIS mean that you can address the huge problem of data quality. The bonus here is that all data quality issues could be logged, meaning that you can go back to the owners of source systems and let them know that their data is in a mess.
- Control - If you do encounter data quality issues, you may decide that they are actually too bad to load into the data warehouse and present to your users.
- Transparency - if you want, you can choose to expose the data quality level of your facts to users, via the Kimball-style audit dimension. E.g. we may have noticed that within our retail data, one store has abnormally low sales. Therefore, we can assign this fact a lower data quality score than the other sales by store for the day, which lets the users know that the data quality steward is aware of the issue, and is looking into it.
- Slowly Changing Dimensions - A lot of source systems don't behave. E.g. if we're tracking a customer's Marital Status of 'single' or 'married', you often find that source systems only record the latest change to a given field. Without addressing slowly changing dimension issues in a data warehouse, we may never know that Customer X is now single, but once was married.
- Conformed Dimensions - When we have multiple data sources for a single dimension, the goal is to ensure that each dimension available to end users represents a single, consistent view of that dimension. Due to the often complex matching and de-duplication that's required, it's difficult to see how this would be possible without a data warehouse.
- Late Arriving Dimension Rows - Going back to the 'Marital Status' attribute, we can get the situation where a source system only updates a particular field a long time after the event actually occurred. E.g, if we think that a customer called John Doe is single, but then today we're told he's married, but has actually been married for the last two years. If Marital Status is an important piece of management information to the business, then we need to update all data associated with John Doe over the last two years, in order to reflect his correct Marital Status over the last two years.
- Complex Transformations - I recently encountered a source system where one of the columns in a SQL table held data in a varying length format of ‘#Product Type#Currency#Value, #Product Type#Currency#Value’. So what you basically have here is potentially say 25 records held in one row in a SQL table, whereas another table might have just 7 records within one row. Rows could also be badly formed! We ended up using a bit of C# within SSIS to split this data out so that the sub-elements could be analysed correctly.
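The split described in the last point above can be sketched in C# along these lines. This is a simplified illustration rather than the production SSIS code, and the method name and exact record layout are my own assumptions:

```csharp
using System;
using System.Collections.Generic;

public static class PackedColumnSplitter
{
    // Splits "#ProductType#Currency#Value, #ProductType#Currency#Value, ..."
    // into one string[] { productType, currency, value } per record.
    public static IEnumerable<string[]> SplitPackedColumn(string packed)
    {
        foreach (string record in packed.Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries))
        {
            // Each record looks like "#ProductType#Currency#Value";
            // the leading '#' produces an empty first element after the split.
            string[] parts = record.Trim().Split('#');
            if (parts.Length == 4)
                yield return new[] { parts[1], parts[2], parts[3] };
            // Badly formed records fall through here and could be logged
            // back to the source system owners.
        }
    }
}
```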
I could go on, but I think as a few simple examples the above will do.
Whilst I'm an advocate of data warehousing, I also think that the self service tools, such as PowerPivot, can be complementary to a data warehouse. You'll often find, for example, that there's some data that's not in the data warehouse yet - after all, some organisations have a lot of data, and it will take time before all the business analysis etc can be carried out to get this data into the warehouse. Another example is prototyping: analysts can very quickly build a prototype model, which can then be productionised at a later stage. I think this video from Chris Webb is really worth watching - it covers these examples and a few more.
In summary there's always going to be a need for users to go digging for data and producing their own reports - the data warehouse can't always deliver everything immediately. PowerPivot is going to be great in this situation, but it will work best when it's used alongside cleaned and conformed data, which is exactly what a data warehouse provides.
MDM tools give the control of the enterprise master data over to the data stewards and power users, rather than relying on automated data integration alone.
Master Data Services is no exception to the above. One of the ways that this is true for MDS is that it allows users to inspect the transactions that have occurred (either internal to MDS or from a source system) and choose if they want to reverse them.
In order to achieve this MDS has a useful log of all transactions that's viewable by users. Here's an example of some transactions that have occurred in my test system - some are from data that I've loaded up via the staging tables, some are from manual member additions that I've carried out in the front end, and some are from business rules that have automatically run:
In the model that this example is taken from, I've got some business rules that look to address data quality issues. Taking the Kimball view on data quality issues in a data warehousing context - many can, and should, be addressed in the source system, then re-loaded. That isn't always possible, which is one of the reasons why we have business rules in MDS. However, as good as any sort of automated rule is, there are always exceptions.
In the transactions shown above, an automatic business rule has run that checks a Customer's overdraft limit, then sets it to 10,000 if it's over 10,000. Therefore, when a value of 50,000 was encountered for Member Code 10311, the MDS business rules kicked in and quite correctly did their job. This was not what I wanted in this particular case.
Thankfully we can click on the undo button that's shown above the grid, and reverse a chosen transaction, whether it's come from a source system, a business rule or a manual edit. It doesn't seem possible to reverse many transactions at once, but that may be just due to the CTP. In my example, by selecting the first transaction in the list, then clicking the undo button, I've reversed my automatic business rule. Therefore, the user Kaylee Adams (10311) shown below now has her original overdraft limit:
In conclusion, when some sort of manual intervention is needed to successfully manage master data, MDM tools allow that intervention to come from the power users, rather than having to wait for someone technical to address the issue.
Following last week's release of the November CTP for SQL Server 2008 R2, the latest CTP for PowerPivot is now also released!
The links for the downloads are:
32 Bit PowerPivot Download
64 Bit PowerPivot Download
PowerPivot Main Download Page
Included in the new CTP3 features are several new DAX functions, especially quite a few around time intelligence. Some look quite comparable to MDX, whereas others possibly give even more functionality than MDX. This is something I hope to blog about soon.....
I've been keeping an eye on the SQL Server 2008 R2 CTPs over the past few months, but have been compelled to start blogging again following the release of Master Data Services (MDS) in the November CTP.
The idea of a Microsoft MDM tool first caught my attention with the acquisition of Stratature, and since then I've seen a few talks on the subject, such as Kirk Haselden's talk on the subject back at the BI Conference last year.
Now that I've got my hands on it, I've decided to cover the set up of business rules in MDS. Business rules are key to an MDM solution. If we want to use MDM to load data from disparate source systems, we will definitely have to carry out a lot of cleansing and conforming in order to ensure that the end users only consume clean and accurate data.
To set the scene a bit, I've created several entities in my MDM model, namely Customer, Country and City. These could form a Customer Geography hierarchy for example, but for the moment I'm going to focus on Customer. The following shows the Customers that I've entered manually:
When I add a Customer (via the button that is circled) or when I edit a customer, the third column of the grid for the relevant member will change from a tick to a question mark, indicating that data validation has not taken place.
For this example, what I want to happen is for the Overdraft Limit attribute to validate that it is within normal boundaries that have been set by the business, e.g. a bank. To do this, I'm going to set up a simple business rule.
Selecting Manage->Business Rules will take you to the Business Rules Maintenance screen, where the 'plus' icon will create you a new business rule. Editing the new blank rule will give a screen with a basic IF....THEN GUI to produce a basic business rule. On the IF part you pick conditions such as greater than, less than etc, alongside an all important dimension attribute. You do this by dragging and dropping conditions, in the screen below:
In my case I've picked that the overdraft limit can't be greater than £10,000, and if it is greater, then set it back to £10,000. This will do for now, but I could have prevented validation from succeeding, or caused MDM workflow to start. Clicking the MDS back button will take us back to the business rules maintenance screen, where the rule is not active until we publish it:
Once we do publish the rule, it will kick in whenever validation runs, or when you manually run the business rules. In my grid of Customers above, I have an overdraft value that is a mistake. When I validate the Customer entity, the 5555555 for the second customer automatically reverts to £10,000, as shown below:
This is quite a simple example of course, and via manual editing. The real power of these business rules will come when loading masses of data from source systems, with the added power of workflow to prompt business users to deal with the validation issues that may occur. I'll aim to post about integrating from other systems via my next post in due course....
Interesting to see yesterday that XLCubed have released version 5 of their popular OLAP reporting tool. As I've mentioned before, I've always found XLCubed to be very powerful and easy to use, and therefore, in my opinion, it's definitely one of the best Excel OLAP reporting tools out there.
Version 5 sees the introduction of some new features in the Excel/web editions, namely:
- Enhanced Excel integration with 'MicroCharts';
- An improved user interface in the Web Edition;
- Print to PDF and enhanced save to Excel options in the Web Edition;
- Fine grain control over publishing Excel reports to the web.
Personally, I'm used to using the Excel Edition to slice and dice cubes to get what I want quite easily. Although it's not entirely new to V5, you can carry out the same kind of ad-hoc analysis in the Web Edition, as shown in the following video. The end result is that you are able to build the kind of dashboards as shown below, whether in the web or the Excel edition:
As usual, the evaluation editions for the Excel and Web versions can be downloaded here.
I've noticed that the PPS Technet documentation has been updated recently to include an official Microsoft method to carry out auditing in PPS Planning.
PPS will do some basic auditing out of the box, namely to the audit.log file on the server. This will automatically capture key events that occur on the server, e.g. creation of a model, updating of a dimension etc. The audit file does not, however, track changes to the model fact data. There has been a custom solution around for this for a while now - Sacha has written an excellent post here that details what you need to do in order to implement your own PPS audit trail.
Like Sacha's method, the Microsoft approach involves creating auditing tables, which should then be populated by running a custom stored procedure. The stored procedure should then be scheduled on a periodic basis (e.g. hourly) to capture any new activity. This is a bit different to Sacha's method, where triggers are used to capture changes in real time as they occur. In both cases the idea is to use something like Reporting Services to view detailed auditing reports on your PPS data.
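To give an idea of what the trigger-based flavour looks like, here is a simplified sketch. The fact table and audit table names and columns below are entirely hypothetical - PPS Planning generates its own measure group table names, and Sacha's post has the real details:

```sql
-- Hypothetical names: PPS Planning generates its own fact table names
CREATE TRIGGER trg_AuditFactChanges
ON dbo.MG_SomeModel_MeasureGroup_default_partition
AFTER INSERT
AS
BEGIN
    -- Capture each newly written fact row into the audit table
    INSERT INTO dbo.FactAudit (RowID, Value, ChangeDatetime)
    SELECT i.RowID, i.Value, GETDATE()
    FROM   inserted i;
END
```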
One thing that did catch my eye in the Technet documentation is a method to decode the binary 'change list' column that's held in the dbo.Submissions table. Whereas you can manually export the change list to a CSV file, there has historically been no way to take what's in the change list column and automatically decode it into a useful format. The following C# code will read the change list, and then insert it into your newly created auditing table:
DataSet ds = new DataSet();
DataLayer dl = new DataLayer("PPSConnection");
ds = dl.ExecuteDataSetFromSQL(@"SELECT [SubmissionID] FROM [_AppDB].[dbo].[Submissions] s1
    WHERE s1.SubmissionID NOT IN (SELECT SubmissionID FROM [_StagingDB].[dbo].[SubmissionsAudited])
    AND s1.[Status] = 0");
string sSQL = "";
foreach (DataRow r in ds.Tables[0].Rows)
{
    sSQL = @"INSERT INTO SubmissionsAudited(… ) VALUES(";
    // RETRIEVE THE CHANGELIST FOR THIS SUBMISSION
    DataSetWrapper dsw = new DataSetWrapper((Byte[])r["ChangeList"]);
    foreach (DataRow cldr in dsw.DataSet.Tables[0].Rows)
    {
        // SUBMISSION ROW DATA
        sSQL += r.ToString() + ", "
              + r.ToString() + ", "
              + r.ToString() + ", "
              + r.ToString() + ", '"
              + r.ToString() + "', ";
        // CHANGELIST ROW DATA
        foreach (object o in cldr.ItemArray)
        {
            sSQL += "," + o.ToString();
        }
        sSQL += ")";
        // STORE EACH CHANGE TO THE AUDIT TABLE
    }
}
Click here to view the Technet documentation.