Callum Green's Blog

Query Parameters in Power Query – Part 1

A new feature called Query Parameters has recently been added to Power Query.  It was released to Power BI as part of the May 2016 update.

So what is a Query Parameter?  For anyone familiar with Microsoft Reporting Services, it offers much the same functionality as a Report Parameter.  In simple terms, it enables you to filter your data dynamically in Power Query.  There has always been a Filter UI function, but it could not be linked to a parameter or plugged into the M code.  We can use Query Parameters to reduce the data volumes brought into a workbook (useful for large datasets or tables) or to allow a user to switch between filters (e.g. changing database connections dynamically).

In this blog, I will concentrate on how we can reduce data volumes by implementing a Query Parameter.  Look out for Part 2 in the coming weeks, where I will pick a use case for when we would want to dynamically switch between parameter values.

Reducing Data Volumes

Imagine transforming or just analysing a fact table that contains over 1 billion records.  Not only will Power BI (and Power Query) probably crash or, at best, perform badly – there is no need to bring all of this data back into a workbook.  A subset, or perhaps an individual month, is all that is required to undertake the necessary Power Query ETL operations.

The below example demonstrates how a Query Parameter can be used to filter records in a given dataset.  I have used the AdventureWorksDW2014 database (download here) and picked the FactProductInventory table, which contains 776,286 rows.  I will reduce this count by adding a new Query Parameter in Power Query – using the DateKey to find all rows between 30th November and 30th December 2010.

1.       Assuming I have already loaded the data into the workbook, in the Query Editor, click Manage Parameters from the ribbon.


2.       Create two parameters – one named DateKeyFrom and the other DateKeyTo.  The former contains the value 20101130 and the latter 20101230.  Keep the default settings and confirm by clicking OK.  Your two parameters should look like the below:



3.       Click on the DateKey column drop down and choose Number Filters and Between – highlighted below.





4.       Set the filter to pick up the two newly created parameters.  The following screenshot illustrates the criteria required; the key change is switching the second drop down from ‘Decimal Number’ to ‘Parameter’.


5.       Check that your dataset has been filtered and only contains data for the time period December 2010.

6.       If you need to change the date criteria and, let’s say, only display a few days of data, this can be achieved easily by clicking the parameters in the Queries pane.  I have edited the DateKeyFrom value to 20101225.  You can also update a value by clicking the Manage Parameters button.


7.       As soon as you click back onto the FactProductInventory table query, the data automatically begins to refresh.  We are now left with only the final few days of December, shown below.
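Behind the scenes, these steps generate ordinary M.  A minimal sketch of what the Advanced Editor might show for this example (the step names, parameter values and metadata record are illustrative, based on the steps above):

```m
// Each parameter is just a named query whose value carries metadata.
// DateKeyFrom, viewed in the Advanced Editor, looks something like:
//   20101130 meta [IsParameterQuery = true, Type = "Number", IsParameterQueryRequired = true]

// The Between filter from step 4 then references both parameters directly:
let
    Source = FactProductInventory,
    #"Filtered Rows" = Table.SelectRows(Source,
        each [DateKey] >= DateKeyFrom and [DateKey] <= DateKeyTo)
in
    #"Filtered Rows"
```

Editing a parameter value simply changes the value that `DateKeyFrom` resolves to, which is why the dependent query refreshes automatically.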




As Query Parameters are hot off the press, I have only just started experimenting with them.  The feature looks to have been added not only to reduce and filter data, but to enable us to plug dynamic parameters into M.  Previously, I used a Function as a workaround, but can now achieve the same thing in the proper context.  Click here to view a previous blog, where I specifically create and invoke a function.  There are a number of other uses for Functions and Rafael Salas’ blog provides an excellent walkthrough.

My next blog will focus on how you can switch between different parameter values and dynamically display the results.  If anyone has built their own Query Parameter examples and want to share them, feel free to leave me a comment below.  I would be very interested to see how far people have already taken this new feature.

Further Reading

There are already a few great blogs and use cases out there:

1.       Chris Webb’s Blog -

2.       Soheil Bakhshi Blog (1 of 2) –

3.       Soheil Bakhshi Blog (2 of 2) -

Flat Files in Power Query

The out-of-the-box functionality in Power Query will enable you to import pretty much any flat file, ranging from csv to fixed width.  However, there are instances when badly formatted files cannot be loaded correctly and this blog will demonstrate a workaround for this. 

Unusual Flat Files

Whilst csv and txt are the most common types of flat file, I have worked on a number of BI projects where the data warehouse needs to ingest fixed width, ragged right or pipe delimited text files.  You won’t be surprised to hear that Power Query recognizes all of these formats!   I will demonstrate below how simple it is to import a Ragged Right file with a ‘.in’ extension. 

Ragged Right in Notepad:


Power Query’s Interpretation:



Badly Formed Flat Files

You won’t normally know that a flat file is badly formed until you actually import it.  In the below example, a CSV file contains both missing and double commas.  CSV stands for ‘comma separated values’, so Power Query expects a comma to split each column.  Any developer who has imported data (through SSIS, SQL import, etc.) will have encountered a problem like this.

Two Commas in Notepad:


Two Commas in Power Query:

As expected, Power Query believes there is a blank column between the values ‘Liverpool’ and ‘Stoke’.  As a result, the data is shifted to the right.


Missing Commas in Power Query:

A missing comma between columns works the opposite way.  Data is shifted to the left.



As we already know that the CSV file contains both double and missing commas, the next step is to import the data in a non-delimited format.  To do this, start a new Blank Query and type the following code into the Advanced Editor:


Value = Table.FromList(Lines.FromBinary(File.Contents("D:\Documentation\…")), Splitter.SplitByNothing(), null, null, ExtraValues.Error)



The code turns the flat file data into a table, treating every row as a single column – hence the ‘SplitByNothing’ syntax.  This is how the query looks:


We can now replace double commas with a single comma and split the table by ‘,’.  I have previously shown you how to Replace Values in an earlier blog, but will quickly demonstrate the Split Columns function.


Power Query auto detected all of these settings, as it spotted that the comma is the most frequent delimiter and occurs 10 times per row.  If you drop down Advanced options, you can change the delimiters and output columns if they are not interpreted correctly by Power Query.   Finally, we can apply the Use First Row As Headers UI function and filter the nulls in the ‘HTR’ column.
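In M, the replace and split operations above might be sketched as follows (the step names, column name and the 11-column count are assumptions based on the ten commas per row described above):

```m
// Collapse each double comma into a single comma, then split every row into its columns
#"Replaced Value" = Table.ReplaceValue(Source, ",,", ",", Replacer.ReplaceText, {"Column1"}),
#"Split Column by Delimiter" = Table.SplitColumn(#"Replaced Value", "Column1",
    Splitter.SplitTextByDelimiter(","), 11)
```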


There is no easy way to write automated logic to fix these three records, because the missing commas were in different places and, in our current query, different columns.  There are also no string patterns that can easily be deduced.  Therefore, this time around we will use Replace Values to move the data to the required columns.


There it is - a very effective way of visualising and managing both unusual and badly formed flat files in Power Query. 

Next Steps

We could easily extend the code to handle other file types or delimiters. Ken Puls’ blog demonstrates how you can pass the flat file connection string in as a parameter within a function.  This would be very handy for someone wanting to bring in multiple flat files through one configuration.  I would be very interested to see if any other Power Query developers have their own M code that handles flat files.  Feel free to comment or contact me directly if you have any suggestions on how to extend my example further. 

ETL in Power Query

If you have read any of my other Power Query blogs, you will have seen the results of transformed and mashed up queries - but not how it was achieved.  This article picks out my five favourite Extract, Transform & Load (ETL) features in Power Query.


Being a keen sports fan, I wanted to find out the medal history of the 100m Olympic Finals for both men and women. 

I’ve picked two datasets, both from different sources.  You can download them using the links below:

-          Source 1 - CSV

o   ISO Countries.csv -

-          Source 2 – Web


o   There is a lot of other data on the site but for this example, I have only picked the race result data tables - Men [edit] and Women [edit].

Please Note:      I am happy to provide the Power BI Workbook upon request.  Leave a comment below and I will get back to you ASAP.

The imported data (pre transformation) should look like the below in Power Query.  You may notice some ETL has already taken place, but this was carried out automatically by Power Query and includes simple tasks like promoting column headers.



Unpivot Columns

There are many occasions when we need to completely flip the contents of a table.  For our example, ‘Gold’, ‘Silver’ and ‘Bronze’ should not be columns, but rows.  We only need one column, which can be named ‘MedalType’.  The unpivot feature can be applied (to a whole query) with just one click of a button.  Ensure that the following columns and features are selected:



Once the data is un-pivoted, rename the new columns to ‘Athlete’ and ‘MedalType’ respectively.  Repeat the process for the Men [Edit] query, which looks like the following:


Replace Values

When quickly eyeballing the data (in Men [Edit]), it is apparent that two athletes shared a Bronze medal in 1896.  We need to handle this data defect but unfortunately, there isn’t a one click method of achieving this.  The easiest way is to use the Replace Values and Remove Top/Bottom Rows UI functions.

Firstly, let’s duplicate the full Men [Edit] query by right clicking on it and selecting Duplicate.  We now want to remove all records from the copied query, apart from the one that contains the two bronze medallists.  To do this, right click on the little table button in the corner of the query and choose either of the following:


You will then specify how many top or bottom rows you want to remove.  Ensure the only record left looks like the below:


Lastly, the Replace Values function is used to strip out ‘Francis Lane’, leaving only the second medallist and therefore removing the duplicate.  The aim is to insert this one record back into the main Men [Edit] query.  Right click on ‘Athlete’ and select Replace Values.  Type the below text in the function box and click OK.

Value to Find:   Francis Lane (USA)#(lf)#(lf) Alajos Szokolyi (HUN)
Replace With:   Alajos Szokolyi (HUN)

The ‘#(lf)’ elements of the search string match line feed characters; without them, the replace will not work.


We must tidy up the Men [Edit] query by doing the same as above, but the opposite way around.  This leaves us with the following:


Append Queries

This is another neat feature in Power Query.  In SQL Server or SSIS terms, this provides a union between two queries.  If the column names and data types do not match, the append may fail or give unexpected results.

Open the Men [Edit] query, click the Home tab in the top menu pane and select Append Queries.  A pop up window will open, where you can specify the queries to append into Men [Edit].  For this example, I selected the below and clicked OK.


-          It is possible to append multiple tables and in fact, we need to do this in order to bring both the Men [Edit] (2) and the Women [Edit] queries into Men [Edit].

-          To make it easy to distinguish between male and female, I have added a new column to the queries that specifies gender.  This is optional if you are following this example.

-          If we select the Two tables option, only one query can be appended at a time.
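Under the hood, Append Queries generates a Table.Combine step.  A sketch of the three-way append described above (the query names follow the screenshots; yours may differ):

```m
// Union the men's queries (original and duplicate) and the women's query into one table
Source = Table.Combine({#"Men [Edit]", #"Men [Edit] (2)", #"Women [Edit]"})
```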

We can rename the newly appended query to Athletes and tidy up the Men [Edit] (2) and Women [Edit] queries.  Putting raw queries into an appropriately named Group helps distinguish each part of our ETL – see below:


There are more queries now but the Raw folder contains imported, untouched queries, whereas Transform includes tables that have been modified and joined together.  The Load folder will eventually contain the star schema tables (dimensions and facts).

Add Custom Column

Any developer working in ETL will have needed to derive a column before.  Power Query offers this functionality through both the UI and the M query language.  We want to create a new column called ‘Nationality’, which extracts the characters (between the brackets) from ‘Athlete’.

In Power Query, a derived column is called Custom Column.  Ensure the Athletes query is selected, then navigate to the Add Column tab in the top menu pane and select the Add Custom Column button.  The pop up box appears, where we can write some custom M code:


-          Text.End is the equivalent to the RIGHT function in SQL Server.

-          Text.Start is equivalent to LEFT.

-          Wrapping them both into one line of code ensures the string inside the brackets is taken from ‘Athlete’.

-          Rename the new column to Nationality.
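Assuming every athlete string ends with a three-letter code in brackets, e.g. ‘Usain Bolt (JAM)’, the custom column formula could be sketched as follows (the step names are illustrative):

```m
// Text.End grabs the last 4 characters ("JAM)"); Text.Start keeps the first 3 ("JAM")
#"Added Custom" = Table.AddColumn(#"Appended Query", "Nationality",
    each Text.Start(Text.End([Athlete], 4), 3))
```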


Merge Queries

This feature is the same as applying a JOIN in SQL Server.  In simple terms, it gives us the ability to mashup and bring together queries from different data sources.  We will bring the ISOCountriesRaw data into the Athletes query.

Under the Home tab, select Merge Queries.  A new window appears, where we can specify the tables and columns to be joined.  Simply select the ISOCountriesRaw query and the two columns shown in the below screenshot.  The Join Kind should be Left Outer, as this brings across only matching records from ISOCountriesRaw, while keeping all records in the main Athletes query.


The wizard also gives you a match preview; for this example, 142 of the 144 records in Athletes match the ISOCountriesRaw table.  Click OK to complete the merge.
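The Merge Queries wizard writes a Table.NestedJoin step behind the scenes.  A hedged sketch (the join column names here are assumptions; use the two columns you selected in the wizard):

```m
// Left outer join: keep every Athletes row, attach matching ISO country rows as a nested table
#"Merged Queries" = Table.NestedJoin(Athletes, {"Nationality"},
    ISOCountriesRaw, {"Alpha-3 Code"}, "ISOCountriesRaw", JoinKind.LeftOuter),
// Expand only the column(s) we need from the nested table
#"Expanded ISOCountriesRaw" = Table.ExpandTableColumn(#"Merged Queries",
    "ISOCountriesRaw", {"Country"})
```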

There are a number of columns that aren’t needed in this table and can be filtered out accordingly:


There are two records that don’t match because the medal winner was subsequently removed from the record books.  We can use the Replace Values function to replace the nulls with some more meaningful information (see below).  We are now ready to turn the data into a star schema!


Next Steps

Now that we have a fully transformed query, it can be broken down into a fact table and dimensions.  I will not demonstrate this in this blog, but by following Kimball’s data warehousing principles, we end up with a star schema that can be plugged into Power Pivot or Power View.

NOTE:   Add Index Column was used to create primary and foreign keys on the ‘Load’ tables.  See ‘Other Features’ for link on how to do this.





Other Features

There are many other ETL related functions within Power Query that have not been discussed in this blog.  It is also possible to enhance the UI features using M or write specialist code to enhance and configure your ETL process, which I demonstrated above.

For further information, I would recommend reading the below articles:

-          5 Very Useful Text Formulas - Ken Puls

-          Add Index Column – Microsoft

-          Split Columns - Mark Vaillancourt

-          Group By – Microsoft

-          ETL Examples – Chris Webb

Parameterise Multiple Queries in Power Query

The purpose of this article is to demonstrate the simplistic power (excuse the pun) of Power Query and how you can configure numerous queries to run from a single parameter.  The concepts covered are similar to those discussed in my last blog, but instead of configuring iterated outputs, we will use one query to drive the data in a number of other queries.

Please Note:       Substantial ETL was required to prepare the data before any parameterisation could take place.  I will not be showing this in today’s blog, but am happy to provide the full workbook upon request.


Continuing with the football theme, I want to find out the Premier League’s top goal scorer for any season, ranging from 1992/93 all the way up to the present day.  The aim is to pass a value into a workbook query, which will determine the data shown in our formatted, transformed query.

“This sounds exactly the same as your last blog post”, I hear you say.  Well, it is similar – but I will show you how to pass a given query value in as a parameter to another query.  In other words, we can use basic configuration to drive our output.

Import Data

Firstly, to follow this example, you need to import the 15/16 premier league leading scorer web data into Power Query.  This can be found at:-

You will see that there are over 30 tables that could be imported.  Upon further inspection, each player is put into their own table, and we will need to bring in most of the tables.  I decided to pick the top 20 players and any associated data we may need, which is Table 3 to Table 23.

Once the tables are opened as queries in Power Query, rename Table 3 to ‘Stats’, Table 4 to Rank1, Table 5 to Rank2, etc. – all the way up to Rank20.  The queries will now look like the below:


Config Table

If you haven’t already noticed in the above web address, ‘saison_id’ is the parameter we can pass different years into.  We will now set up a one-value config query (using the Enter Data button) that will be used to derive ‘saison_id’.  Save the query and give your column a meaningful name – see below.
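Enter Data generates a compressed, encoded table in M; a readable equivalent of the one-value ObtainYear query might look like this (the column type is an assumption):

```m
// A single-column, single-row config table holding the season to load
let
    Source = #table(type table [Year = text], {{"2015"}})
in
    Source
```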


Referencing Config Value

The next step is to reference the config value and nest it into the web address connection of all our other queries; for this example, we will use the Rank1 query.  Use the Advanced Editor to edit the M code.  Here you will see the web connection, but before we add our table value in as a parameter, we need to declare it – just as you would in any other programming language.  The code looks as follows:


-          ‘YearParam’ calls the Config query, which I named ObtainYear.

-          ‘YearParamValue’ finds the record at position ‘{0}’, which in Power Query means the first row.  ‘[Year]’ is the name of the column and can be seen in a screenshot above.

-          Finally, ‘YearParamValue’ is called in the web connection string, replacing the hardcoded value of ‘2015’.
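Putting the three bullet points together, the Rank1 query might be sketched as follows (the web address is truncated here, and the table index is illustrative):

```m
let
    YearParam = ObtainYear,                 // reference the config query
    YearParamValue = YearParam{0}[Year],    // first row of the [Year] column
    Source = Web.Page(Web.Contents(
        "…/tabelle/wettbewerb/GB1/saison_id/" & YearParamValue)),
    Rank1 = Source{4}[Data]                 // pick the relevant table from the page
in
    Rank1
```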

We can now close the Advanced Editor and test that the code works.  To do so, change the config query value to ‘2014’, by using the Replace Values option in the top menu ribbon. The value in Rank1 should now read as ‘Sergio Aguero’.


Apply the same code to all of the other queries so that all values will change when a different year is entered into the ‘Year’ column of ObtainYear.

Automating Config Query

To advance our workbook even further, we can turn the ObtainYear query into a function.   This will automate the result in all our other queries and enable the export of different datasets.  I have shown you how to create a function in a previous blog, but have still provided the syntax required at the beginning of the M code in Advanced Editor.


Finally, we can Invoke the function by entering a year.  In my example (not previously shown), I have invoked the year ‘1999’ in the ObtainYear function.  The below output is the result of a transformed query that brings all the data from the other queries together.


There it is, a fully dynamic import process that will connect to a web site and import data determined by a query parameter!  This is all achieved using Power Query’s intuitive UI, with the extra help of the more configurable and flexible M native language.

Other Uses

This article may use web scraping as the example, but you can adopt the same approach to loop through text files, database objects or blob storage.  I have now demonstrated how you can use both function and query values as parameters; with the potential to combine the two, creating ETL processes in Power Query could be very powerful.

I am just scratching the surface here and aim to explore more ETL automation very soon.  Rest assured I will share all my findings - so look out for future blogs.

Loop through Multiple Web Pages using Power Query

As detailed in my previous blog, scraping website data in Power Query is very simple.  The aim of this article is to demonstrate how you can advance basic web scraping and pass parameters into the Power Query web connection string.  


I wanted to obtain the league table history of the English Premier League – starting from 1992 all the way up to the present day.  I managed to find a very useful website that gave me all the information required but the problem was each web page equated to one year’s league table.  As a result, I would have to import the data into Power Query over 20 times, hardcoding the specific web page every time.

There must be a better way, surely?  Is there not something in the web address that can be parameterised, or a function in Power Query that can support iterations?  Let’s find out…

Parameterize Web Connection

Firstly, to follow my example, you need to import the 92/93 league table data into Power Query.  This can be found at:

I carried out some basic formatting and ETL on the query, which is not shown in this blog.  If you would like a copy of the Power BI workbook, please reply to this post.  Here is how the league table looks in Power Query:


If you look very closely at the web address link above, you will see that part of the connection string contains the league table year – ‘1992’.  This is what we want to parameterise.  The best way to use a parameter in Power Query is to create a function.  This is just like a SQL Server function, which passes variable(s) through to obtain an end result.   Instructions are given below.


1.       Open the ‘LeagueTable’ query, navigate to ‘Home’ in the top menu and select ‘Advanced Editor’.


2.       This will open all of the applied steps in ‘M’, the Power Query language.  Firstly, we need to write the code to create the function, which is achieved with the following line.




(Year as number) as table =>


-          ‘Year’ declares the variable to pass through, with a data type of ‘number’.  For this example, only one variable is needed.

-          ‘as table’ declares that the function returns a table.

-          ‘=>’ tells Power Query that the code which follows is the function body.

3.       We now need to change the web connection string and pass through our new ‘Year’ variable.  This is achieved by replacing the ‘1992’ part of the string.


Source = Web.Page(Web.Contents("…league/tabelle/wettbewerb/GB1/saison_id/" & Number.ToText(Year))),

-          The ‘1992’ string has been replaced with the ‘Number.ToText(Year)’ function and variable.  The ‘Number.ToText’ function converts the numeric variable into a string so it can be concatenated.

-          The ‘&’ is used to concatenate the web connection string and the variable together.

-          The variable has now replaced the hardcoded year.
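Combining steps 2 and 3, the complete function body might be sketched as follows (the web address remains truncated, and any downstream formatting steps are omitted; the table index is illustrative):

```m
(Year as number) as table =>
let
    Source = Web.Page(Web.Contents(
        "…league/tabelle/wettbewerb/GB1/saison_id/" & Number.ToText(Year))),
    LeagueTable = Source{0}[Data]   // pick the league table from the returned page
in
    LeagueTable
```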

4.       When you are happy with the code, click ‘Done’.  Assuming the syntax is correct, the ‘LeagueTable’ query will now display an ‘fx’ symbol next to it.  This illustrates that it is now a function.


5.       Finally, let’s test the function by clicking the ‘Invoke’ button highlighted in the previous screenshot.  Enter your desired year and click ‘Ok’.  I have chosen 2014.


6.       As you can see, our ‘LeagueTable’ query has completely changed.  It is showing the league table from the 14/15 season and we didn’t need to manually import this data!

7.       Before moving on, we need to remove the invoked function from the ‘Applied Steps’ menu pane.  Click the ‘X’ to achieve this.


NOTE:   There is currently no way of creating a function using the user interface.

Automated Iteration

One big limitation of Power Query is its inability to iterate through values or loop through a connection string – at least if you want to write a native while or for loop.  However, there is a neat workaround that calls a function and loops through the parameter values based on the rows of a lookup table.

1.       Let’s create a simple query by selecting New Source > Blank Query.


2.       Type the following code into the formula bar and press Return:

= {1992..2015}


The ‘1992..2015’ syntax is Power Query’s way of auto generating all the years between those dates.

3.       Convert the query To Table and rename the column from ‘Column1’ to ‘Year’.


4.       We now want to Add Custom Column.  A new screen will pop up, where we can add some native M.  We need to pass the ‘Year’ column from our table into the recently created function.  Click OK to confirm.


In effect, this is running the function against each row for the column ‘Year’.
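The custom column simply invokes the function once per row.  A sketch, assuming the function is named LeagueTable as in the earlier steps and the previous step was the To Table conversion:

```m
// Run the LeagueTable function for every year in the lookup table
#"Added Custom" = Table.AddColumn(#"Converted to Table", "Season",
    each LeagueTable([Year]))
```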

5.       The query may take up to 5 minutes to complete, depending on your Internet connection.  Once it has completed, you will see a Table link in each of the new column’s rows.  Expand this column to view the data.


6.       As if by magic, we can now see all Premier League seasons in one table!   We can now bring this data back into Power BI or Excel to carry out further analysis.


Future Blogs

My next couple of Power Query blogs will be looking at basic data automation/configuration and some useful ETL techniques that will prepare our data for analysis. 

Scraping Website Data using Power Query

In Power BI and more specifically Power Query, there is a simple, out of the box Web connector that enables you to scrape data from a website.  This blog will show you how to achieve this in both Power BI desktop and through Excel.

Power BI Desktop

1.      Open a Power BI Desktop.  Select ‘Get Data’.


2.      In the pop up window, select ‘Other’ > ‘Web’ and click ‘Connect’.


3.      Type in the website you would like to scrape.  For this example, I have used the BBC Sport website – looking at the ‘Premier League’ football league table.  Click ‘OK’ to continue.


4.      Power Query will try to obtain the data from the website, which may take a couple of minutes to load.  As long as you are connected to the Internet, the query will complete and look like the below:


5.      I have ticked the second option, however, you could select all three datasets if preferred.  When happy with your selections, click ‘Load’.

6.      You have now imported the scraped data, but to view it in Power Query, click ‘Edit Queries’ from the top menu pane.


7.      The Query Editor will open in a separate window, with the scraped data showing in its raw state.  You can now modify/join/clean to your heart’s content!
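For reference, the UI steps above boil down to a couple of lines of M (the URL and table index here are illustrative, not the exact ones used in the screenshots):

```m
// Scrape a web page and pick one of the tables Power Query finds on it
let
    Source = Web.Page(Web.Contents("http://www.bbc.co.uk/sport/football/tables")),
    Data = Source{1}[Data]   // corresponds to the item ticked in step 5
in
    Data
```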



Excel

1.      Firstly, you must have Excel 2013 or later and the Power Query add-in for Excel installed to be able to scrape a website using the below method.

2.      Open Excel and start a new workbook.  Then, click ‘Power Query’ from the top menu pane and select the ‘From Web’ button.


3.      Enter the URL (demonstrated in step 3 of Power BI Desktop example) and click ‘OK’.

4.      Choose your desired data set or alternatively, pick more than one by ticking the ‘Select Multiple Items’ option.  Click ‘Load’ to download the data.


5.      When loaded, you can preview the data by hovering over it – found under the ‘Workbook Queries’ menu pane.


6.      To open the data in Power Query, double click the workbook query.  A new window will open and the data available to edit – see below:



Whilst playing with the web scraping feature, I have found that some websites work better with Power Query than others.  At the time of writing, I cannot find any official Microsoft documentation on which elements of a website work and, more importantly, which don’t.  However, web data that is wrapped in an HTML table or div tags seems to work best with Power Query.

What’s Next?

As I have demonstrated, scraping websites using Power Query is very easy.  This feature does not require any M or coding expertise, as Power Query works its magic under the hood.  This is a great alternative to C#, JavaScript, etc. for scraping a web API.

This article shows you how to bring the data into Power Query but in reality, it will need to be cleaned and transformed to be useful.  Excel Power Query Web Scrape - AFL Fixtures by Matt Allington and Using Power Query M Language for Scraping Any Website Data by Chris Webb are two very useful blogs demonstrating how this can be achieved.  If you are already a very capable M developer or have used Power Query in the past, you will know how powerful, yet simple this is.

Future Blogs

Look out for more blogs on M and Power Query in the future.  The next installment will be on how we can write native M to scrape websites, focusing specifically on web data that cannot be accessed through the user interface.

New Features in SQL Server 2016 – Part 3: Reporting Services

I am pleased to announce that Microsoft have made some improvements to Reporting Services (SSRS) in the upcoming SQL Server 2016, starting with CTP2.  With the acquisition of Datazen and advances in Power BI, SSRS seemed to be falling behind – both in terms of functionality and look and feel.

In the Preview version of SQL Server 2016, the following features have been announced or improved:

·        Built-in R Analytics.

·        Improved DPI in Report Builder.

·        Reporting Services Subscriptions.

Within the SQL Community, there has been a lot of talk around some of the existing SSRS functionality being improved and made more modern.  As SQL Server 2016 CTP2 has only been public for a couple of weeks, not all SSRS features are yet available.  The below are expected to be in the full release, although they have yet to be officially announced.

·        New Parameter Panel, chart types and design.

·        Power View included in SSRS.

·        Mobile BI.

Improved DPI in Report Builder

DPI (Dots Per Inch) on new devices makes rendering reports very difficult in the current version of SSRS.  Because these devices have high DPI, more pixels need to be drawn and display elements can look very small.  SSRS 2016 will be able to handle high DPI without distorting reports.

1.      Windows 8.1 DPI Scaling Enhancements.

a.      Optimizing the usability and readability of high-DPI displays.

b.     Empowering developers to optimize app-specific scaling based on display DPI.

2.      High DPI and Windows 8.1.

Reporting Services Subscriptions

The following changes have been made to the Reporting Services subscriptions, depending on native SSRS mode and SharePoint.

Native and SharePoint:

1.      Subscription description.

a.      Include a description of the report as part of the subscription properties. The description is included on the subscription summary page.

2.      Change subscription owner.

a.      Starting with the SQL Server 2016 Community Technology Preview 2 (CTP2) release, you can change subscription owners using the user interface or script. This helps when carrying out routine tasks, e.g. when users leave or change roles in your organisation.


Native ONLY:

1.      Enable and disable subscriptions.

a.      User interface options to quickly disable and enable subscriptions. Disabled subscriptions can be easily re-enabled.

2.      Shared credential for file share subscriptions. Two workflows now exist with Reporting Services file share subscriptions:

a.      Reporting Services administrator can configure a single file share account, which is used for one to many subscriptions.

b.     Configure individual subscriptions with specific credentials for the destination file share.

Built-in R Analytics

This enables data scientists to plug R scripts into SSRS reports.  R is a very powerful analytical tool that can cluster data and provide powerful trending capabilities.

1.      Previously, integrating R with SSRS involved an unofficial plug-in (to Visual Studio) from CodePlex.  To download the add-in, click here.  I have also provided a link to Jen Underwood’s blog, found under ‘References’.

2.      A new built-in R component will enable data scientists to interact with and test R scripts in a transactional environment.

3.      Create SSRS reports referencing R scripts.


New Parameter Panel, Chart Types and Design

According to Gilbert Quevauvilliers, SSRS will undergo quite an overhaul in SQL Server 2016.  Some of the improved features include:

1.      Parameters panel will be more interactive and respond in a similar way to the Slicers in Excel.

2.      New chart types being provided, similar to the ones in Power BI.

3.      HTML5 to replace Silverlight, which will ensure reports render in the most used browsers.  Rendering speeds will also increase.

Power View included in SSRS

The talk on social media sites (like Twitter) is that Power View will be added in SSRS 2016 CTP3.  This could bring the following benefits:

1.      Almost any data source will plug in to SSRS.

2.      Better, more modern chart types – improving user analysis.

3.      Potentially the ability to write native M into an SSRS report.

Mobile BI

SQL Server 2016 will support Mobile BI and data visualisation on Windows, iOS and Android devices.

1.      Users can now visualise and interact with data more easily, without additional charge.

2.      Ability to connect to enterprise data sources and use Active Directory for user authentication.

3.      Deliver live data updates on mobile devices and personalise data queries for different users.


For more information on all of the new SSRS SQL Server 2016 features, the below resources/blogs are highly recommended.

·        Official Microsoft Page -

·        Gilbert Quevauvilliers BI blog

·        Matt Landis Blog -

·        Jen Underwood’s R in SSRS Blog -