Adatis BI Blogs

Adatis Lyon > Marseille Charity Ride

Hi folks,

You might be aware that Adatis is somewhat into its cycling, but we don't like to ride for any old reason. So, after the blazing success of the 2016 London > Paris charity ride, we're doing it again… but going even further! This time we'll be making our way from Lyon, through Valence and Avignon, right down to the south coast at Marseille. That's a whopping 340 kilometres!

We're all riding for our own causes from a whole range of different backgrounds. I've chosen to help out some very smart, very generous people over at Stichting Suubi. They're doing great things in Uganda, providing medical care to local school children. What's more, they're developing tools that harness Azure and Machine Learning to automatically track healing progress, recognise wound degeneration and all sorts! Check out TendoCare.AI for more info.

Now, obviously, this is where the ask comes in – I'd love it if you could help me make sure the trip is worthwhile by popping in a donation, big or small. In return, not only will I drag my lazy, unfit self along this ridiculous ride, I'll do it on my huge cargo bike, which is utterly and completely unsuitable for such a ride. For some more stats and general updates, we even have a Power BI report tracking our progress towards charity milestones and training fitness – because yes, we are that nerdy.

Once again, anything you can do to help out, we'd really appreciate it: https://fundrazr.com/suubifoundation?ref=ab_c7L2T7

Thanks,
Simon

Power BI with Azure Databricks for Dummies (in 15 minutes)

Microsoft Azure Databricks is Microsoft's Apache Spark-based platform, optimised for Azure and thus for integration with Power BI. It was released on 26th February this year and is still in preview, but in our recent project we decided to give it a go and explore what such a solution could offer an enterprise data warehouse. To do so, we created a cluster for our project with some notebooks containing a PySpark script that does all our ETL. On top of that, we decided to have Power BI feed directly from the Databricks cluster tables to build our dataset and reports. In this blog post I will take you through the quick process of setting up the Databricks connection in Power BI Desktop, the web app, and the scheduled refresh.

The prerequisites for such a task are:
- An Azure Databricks cluster with data tables (facts, dimensions, etc.)
- A Databricks access token (not the same as an Azure token)
- A Power BI subscription (I'm working with the Pro version)
- Power BI Desktop

And that's all – no gateways, no drivers, no batteries needed.

1. Firstly, we need to make sure we have started our cluster, as this takes approximately 5-10 minutes. Assuming we have already created our token (a non-expiring one would be ideal) and saved it somewhere, we just need to open Power BI Desktop and select Get Data from the welcome screen. If you filter the connector options by "Spark" you will notice there are currently three; we will use "Spark (Beta)" and click Connect.

2. The next step is a bit trickier, as you need the URL address of your cluster (and from now on the cluster needs to be started). To compose that address, open Databricks, go to Clusters and select the cluster you want to connect to. On the cluster edit page, scroll down and select the JDBC/ODBC tab, then copy and save the JDBC URL. From the JDBC URL you just copied, construct the server address you will use when you set up your Spark cluster connection in Power BI Desktop:
- Replace jdbc:hive2 with https.
- Remove everything in the path between the port number (443) and sql (sql/protocol…), keeping the boxed text as per the image below.
In our example, the server address would be https://eastus2.azuredatabricks.net:443/sql/protocolv1/o/5441075421675247/0221-211616-tings395 or https://eastus2.azuredatabricks.net:443/sql/protocolv1/o/5441075421675247/boost (if you choose the aliased version).
Once you have the address, input it under the Server label in the dialog box of the next setup step. Please note that the Protocol must be HTTP. As for the Data Connectivity mode, both are supported, so it is down to your personal or business preference.

3. The next step is to input your login details. This is a bit misleading: for User name you need to enter the literal string "token", and for Password the token string value you previously saved (or, if you forgot to save it, just go and create a new one, as I did a couple of times).

4. After you have successfully connected (if the cluster is online you shouldn't experience any issues) you will be able to select the data tables you need for your report and edit the queries to your liking. It is a 'bit' slow compared to other data connectors, but it is still in beta and hopefully it will become much more responsive in the near future.
5. Upon configuring your dataset, creating the report masterpiece of the day and publishing it online comes the final touch of report development: setting up the automatic dataset refresh from the Power BI web service. And here comes the sweetest part – as Power BI and Databricks are integrated and everything is in the cloud, you don't need gateways or ODBC drivers; just set up the credentials once again as per the following image.

N.B. Whenever your cluster is offline you will see an error that your credentials are not valid. For this to work as expected, always start your Databricks cluster first and make sure it is online before your dataset refresh schedule hits!

However, there is a not-so-sweet part as well. I'd even say salty, like the tears I cry whenever I need to fight this bug. Currently, a week or two after you deploy your dataset, the scheduled refresh starts failing with strange errors like "Bad request" or my favourite, "Invalid method name: 'GetCrossReference'", for which I have logged a bug with Microsoft. The manual fix is to open Power BI Desktop, click Edit Queries and refresh all the data tables one by one. If you get an error, just try again – the Spark connector is a bit buggy and takes a while to fetch the data – but once you have refreshed the dataset and published it, your report will refresh smoothly (provided your cluster is always online when the dataset refreshes) for at least a week.

In conclusion, Power BI with Databricks is a tempting new option for cloud-only solutions which, even in preview, is pretty stable and reliable, so give it a go and let me know if you experience any bugs or discover something new and exciting about this connection.
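One side note: the Spark connector can only see tables that are registered in the cluster's metastore, not unsaved DataFrames. A minimal Spark SQL sketch (database, table and column names here are illustrative, not from the project) that you could run in a Databricks notebook to expose ETL output to Power BI:

-- Register the ETL output as a metastore table so the Power BI
-- Spark connector can list and query it (names are illustrative).
CREATE DATABASE IF NOT EXISTS warehouse;

CREATE TABLE IF NOT EXISTS warehouse.fact_sales
USING parquet
AS
SELECT order_id,
       customer_id,
       sales_amount
FROM   staging.sales_cleaned;

-- Confirm the table is visible to external clients.
SHOW TABLES IN warehouse;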

Displaying vector data on maps in Power BI

Something I have enjoyed doing in the past is making maps, especially ones showing how various things move around London. One thing I did was take some open data on usage of London Santander bikes, pass it through the Google Waypoints API and visualise it using Mapbox. Which got me thinking: is it possible to do something similar using Power BI? Microsoft introduced the geography data type with SQL Server 2008, and more mapping providers are making versions of their tools available in Power BI, including Esri and Mapbox. From a business perspective, IoT devices are enabling the collection of ever greater volumes of geographical data for companies to use in their analyses.

Using the GeoJSON files I had previously produced for the project above, I set about importing them into an Azure SQL DB instance as a geography data type. This involved reading the file into a variable and then iterating over the index position of each individual ride:

-- load the geojson file into a SQL variable
DECLARE @geojson VARCHAR(MAX) =
    (SELECT * FROM OPENROWSET (BULK 'C:\path\to\my\file', SINGLE_CLOB) AS j)

-- use the index position to iterate over each ride
SELECT geography::STGeomFromText(
    'LINESTRING(' + STRING_AGG(CAST(long + ' ' + lat AS VARCHAR(MAX)), ',') + ')', 4326)
FROM (
    SELECT *
    FROM OPENJSON(@geojson, '$.features[0].geometry.coordinates')
    WITH (
        long VARCHAR(100) '$[0]',
        lat  VARCHAR(100) '$[1]'
    )
) a

This results in a row per ride and visualises pretty well in SSMS. If you are familiar with the geography of London you can make out the River Thames toward the centre of the image and Regent's Park towards the top left. This could be overlaid on a shape file of London or a map from another provider such as Google Maps or Mapbox.

However, when you try to load the dataset into Power BI, you find that Power BI does not natively support geography data types. There is an idea you can vote on here to get them supported: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/12257955-support-sql-server-geometry-geography-data-types-i

After a bit of research, it looked like using R within Power BI could be an option. The founding idea was that, rather than converting the longitudes and latitudes to a geography data type as above, I would write them to a table with an Id to associate each ride and let the ggmap library in R join the dots (a sketch of this alternative load appears at the end of this post). Once you have your data loaded into Power BI, you only need a couple of lines of R code to render the map:

library(ggmap)
qmap("london", zoom = 12, maptype = "toner-lite") +
  geom_path(aes(x = long, y = lat, group = LineId),
            size = .7, data = dataset, colour = "#FF69B4", lineend = "round")

An important thing to note is that this R library uses the Google Maps API, which could cause some privacy concerns.
Also, before showing this to users, you would want to assign a Google Maps API token in Power BI; without one you are using a generally available connection, which means your maps will fail to load a lot of the time.

The resulting map displays the paths, although the sorting of each row gets mixed up somewhere in Power BI for some paths, hence the horizontal lines. It also turns out that the Power BI service does not currently support R libraries which make client-server calls over the internet, meaning you cannot publish this map.

So, as a proof of concept, it is possible to display vector paths on a map in Power BI Desktop; however, you will need to wait until Microsoft supports geography data types in Power BI, or allows R libraries which call an external service, before taking it any further.

These links provided the basis for this PoC and would be a good starting point if you wanted to take things further:
https://bertwagner.com/2018/01/16/importing-geojson-earthquake-data-into-sql-server/
https://docs.microsoft.com/en-us/sql/t-sql/spatial-geography/ogc-methods-on-geography-instances?view=sql-server-2017
http://angryanalyticsblog.azurewebsites.net/index.php/2017/02/27/power-bi-routing-visual-with-two-lines-of-r/
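As promised above, here is a minimal sketch of the alternative load: raw coordinates keyed by ride, for ggmap to join up. Table and column names are illustrative, it reuses the @geojson variable from the first snippet, and the ride index (0 here) would be supplied by whatever loop iterates the features array:

CREATE TABLE dbo.RidePoints (
    LineId     INT           NOT NULL,  -- one id per ride
    PointOrder INT           NOT NULL,  -- preserves drawing order
    long       DECIMAL(9, 6) NOT NULL,
    lat        DECIMAL(9, 6) NOT NULL
);

-- OPENJSON without a WITH clause exposes the array index as [key],
-- which we keep so each path can be drawn in the right order.
INSERT INTO dbo.RidePoints (LineId, PointOrder, long, lat)
SELECT 0,
       CAST([key] AS INT),
       CAST(JSON_VALUE(value, '$[0]') AS DECIMAL(9, 6)),
       CAST(JSON_VALUE(value, '$[1]') AS DECIMAL(9, 6))
FROM OPENJSON(@geojson, '$.features[0].geometry.coordinates');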

Conditional Formatting based on a separate measure in Power BI

What are we trying to do?

I recently had a request from a client for a traffic-light indicator in a table visualisation in Power BI. Fine, I thought: conditional formatting was released with the November Power BI Desktop update, which will handle that nicely. However, a further requirement emerged: the conditional formatting had to be based on the difference between the values of two separate measures. For example, if we have a measure for 'Sales Amount' and a second measure for 'Sales Amount Last Month', we need a traffic-light indicator showing green, yellow or red according to whether Sales Amount is more than, equal to or less than Sales Amount Last Month, respectively. My initial thought was actually fairly positive; I had worked with conditional formatting in Power BI and KPIs in SSAS before, so I had a plan of attack. However, when I had a quick search online I couldn't find any posts about this being done before. Luckily my initial feeling proved correct, and this is very easy to achieve!

How we did it

I first imported a very simple data set with 'Date' and 'Sales Amount' columns. Next, I created a measure to calculate the Sales Amount for the previous month:

Sales Amount LM =
CALCULATE (
    SUM ( Sales[Sales Amount] ),
    PREVIOUSMONTH ( Sales[Date].[Date] )
)

Now I needed to create a third measure to act as my traffic-light indicator. All this measure needs to do is produce a value representing the difference between 'Sales Amount' and 'Sales Amount LM', which can in turn drive a conditional formatting range. This can simply be 'Sales Amount' minus 'Sales Amount LM':

Indicator = SUM ( Sales[Sales Amount] ) - 'Sales'[Sales Amount LM]

From here it is simply a case of applying the conditional formatting to the 'Indicator' measure. In case you haven't done this before, you first click on the drop-down arrow for the measure you wish to conditionally format and then select 'Conditional formatting'. Here are the settings I chose for 'Background color scales', and for 'Font color scales'. After applying those settings to our 'Indicator' measure, we arrive at the following for our table visualisation.

And that's it! Of course, this can be used in a number of different ways if you need a different type of value comparison or a different range for your conditional formatting, but it should help get you started. As always, any comments or thoughts are very welcome.

Data Source Permissions and On-Premises Data Gateway: SQL Server and Analysis Services

In Microsoft's documentation surrounding the On-Premises Data Gateway, the advice on permissions for the account used to authenticate a data source in the Power BI Service can be concerning for most, especially DBAs. In the Analysis Services section of the documentation, the advice is:

"The Windows account you enter must have Server Administrator permissions for the instance you are connecting to. If this account's password is set to expire, users could get a connection error if the password isn't updated for the data source."

Server Administrator permissions…? What happened to the principle of least privilege? In a practical sense, the On-Premises Data Gateway has to deal with two very different implementations of Analysis Services: Multidimensional and Tabular. Each is set up and configured differently from the other, and the nature of their security models also differs. As a one-size-fits-all approach, it works; but as we will soon see, the permissions do not have to be set as Server Admin. The SQL Server section of the documentation, on the other hand, doesn't actually specify what permissions are required for the data source to be established in the Power BI Service.

Permissions

Exactly what permissions are required for these common data sources, I hear you ask. As data sources are established at a database level, so too are the permissions set:

- SQL Server Database – db_datareader
- SSAS Tabular Database – Process database and Read
- SSAS Multidimensional Database – Full control (Administrator)

The principle of least privilege is now restored. There remains, though, the curious incident of Analysis Services data sources requiring permissions in addition to Read. Why this is, I am unsure; I have my suspicions and have tried to find out. If you know, please leave a comment below!
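For reference, a minimal sketch of granting the SQL Server minimum to the account behind the gateway data source (database and account names here are illustrative):

-- Run in the database the gateway data source points at.
USE SalesDW;
GO
CREATE USER [CONTOSO\PBIGatewaySvc] FOR LOGIN [CONTOSO\PBIGatewaySvc];
ALTER ROLE db_datareader ADD MEMBER [CONTOSO\PBIGatewaySvc];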

Embed PowerApps into Power BI Desktop

Microsoft's January 2018 Power BI Desktop update (blog found here) contains quite a few small visualisation features, but the one that stood out most to me is the ability to import PowerApps as a Custom Visual. My last two blogs (Part 1 and Part 2) demonstrated how to embed a PowerApp into a Power BI Dashboard in the Service, but it wasn't possible to achieve this in Power BI Desktop or a report. How things have changed within a month! This article illustrates how quick and easy it now is to plug an existing PowerApp into Power BI Desktop.

PowerApps Custom Visual

Like all popular Custom Visuals, you can either Import from file (stored on a local machine) or Import from store. The Microsoft-endorsed Custom Visuals are found in the store, which is why I would generally advise importing visuals from there. Search for "PowerApps" and it will appear as the top result. Click 'Add' and the PowerApps Custom Visual is available.

NOTE: The PowerApps Custom Visual is currently in Preview, meaning its reliability cannot be 100% guaranteed. Always use Preview tools/features/visuals with caution.

PowerApp Connection

I previously created an 'SvT Adjusted' PowerApp, which will be used for this example. Firstly, the visual needs to have 'data'. It can be an attribute or a measure; it doesn't matter which when connecting to an existing PowerApp. If you haven't logged into the PowerApps portal, you may be prompted to do so within Power BI Desktop. Once logged in, you will see the below screen. If you see the 'not supported' error message above, do not worry – this is a red herring. Click 'OK' and then navigate to the appropriate Environment within the nested PowerApps portal. 'Adjust Target App' resides in the 'Callums Blog' Environment. Click 'Choose App', select the PowerApp and click 'Add'. That's it: the PowerApp is now embedded and ready to use in Power BI Desktop. It is also possible to create a new PowerApp from within Power BI Desktop, which is demonstrated in this video. The look, feel and general experience is the same as in the PowerApps portal, meaning you do not even need a web browser to get started.

PowerApps Refresh

I wonder how long Power BI Desktop takes to refresh when a change is made to the embedded PowerApp? Let's find out.

Before: Principal A's 'Target' value is changed from 850000 to 950000.

Unfortunately, nothing happens. Power BI Desktop caches the data sourced from the Azure SQL Database, which is where the PowerApp data is stored. The only ways to view the change are to click the Power BI 'Refresh' button, or to change the context of the visual interaction within a report. What I mean by the latter is that you need to force the Tabular engine to write a new DAX query, which in turn forces a query refresh. Using the 'Refresh' button (below) will always be the simplest refresh method.

After:

Published Report

The report can now be published to the Power BI Service. The Service has the same refresh issues as described for Power BI Desktop; manually refreshing the web page (Ctrl-F5) is currently the best approach. It is worth noting there is a 15-30 second delay between submitting a PowerApp change and viewing the new number in a visual. This is expected, as there are a few processes (behind the scenes) that must happen first.

Web Content vs. Custom Visual

There are now two ways of adding a PowerApp into the Power BI Service – but which is better?

Web Content

Advantages:
- Easy to dynamically add a 'Web Content' tile to a Dashboard.
- More flexibility around the link behaviour, e.g. open a custom link or another report in a new tab.
- The PowerApp sits completely outside of Power BI and does not necessarily need to be seen in a report.

Disadvantages:
- The PowerApp embed code is required and may not be available to all users.
- An extra layer of management and maintenance.

Custom Visual

Advantages:
- No increase in the Power BI Desktop data model or file size. The PowerApp is sourced from the cloud and acts as a Live Connection.
- A one-stop shop where both the visuals and the PowerApp reside.
- Fewer places for an issue to occur. Security can also be resolved, using Power BI Desktop as a test.
- The PowerApp can be pinned as a visual to an existing or new Dashboard. This step is far more user friendly and easier to achieve with the Custom Visual.

Disadvantages:
- No ability to use custom links.

Whilst Web Content does offer a bit more flexibility and interactivity, I would advise using the Custom Visual wherever possible. Having one place where everything is managed makes everyone's lives far easier. If you have a good business case for adopting Web Content, then of course, please do so.

References

Find other recommended resources below.
- PowerApps Custom Visual Video - http://bit.ly/2Fi3vLY
- Power BI January Update - http://bit.ly/2CSfLVl
- PowerApps - http://bit.ly/2Brjys4
- Flow - http://bit.ly/2CoL2vW
- Common Data Service - http://bit.ly/2CnXXhv

Contact Me

If you have any questions or want to share your experiences with the PowerApps Custom Visual, feel free to leave a comment below.
Twitter: @DataVizWhizz

Embed PowerApps into Power BI Dashboards – Part 1

Having read Matt How's blog (found here) about PowerApps, not only did it get me interested in the technology, it also made me wonder how well (if at all) it could integrate with Power BI. Our friend Google soon told me that it was already possible to embed PowerApps into Power BI, released in the April update. However, apart from this blog by Ankit Saraf, there aren't many professionals sharing their experiences. In addition, leveraging Direct Query mode in Power BI means we can simulate real-time user input and reporting. To replicate my solution below, you will need an understanding of PowerApps, Azure SQL Database, Flow and the Common Data Service (CDS). The Further Reading section provides some good links to get you up to speed.

I have broken the blog into two parts:
- Part 1: How Power BI visuals and PowerApps can be used together.
- Part 2: Benefits and drawbacks of the tools/processes used.

Solution

I picked a typical use case that would link Power BI and PowerApps – Actual vs. Target. The PowerApp will be used for adjusting target values, whilst an Azure SQL Database will contain the original target and actual values. All data and PowerApp interaction will be embedded into a Power BI Dashboard.

Create Sample Tables and Data in Azure SQL Database

Create and populate two tables – dbo.SvT for static actual vs. target data, and dbo.SvTAdjusted, which will eventually contain the adjusted target data from the PowerApps form.

Note: Azure SQL tables require a Primary Key column to communicate with Flow and consume CDS data.
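As a minimal sketch of what those two tables could look like (column names and seed values here are illustrative – the post's images show the originals):

CREATE TABLE dbo.SvT (
    PrincipalID INT         NOT NULL PRIMARY KEY,  -- PK required by Flow/CDS
    Principal   VARCHAR(50) NOT NULL,
    Actual      MONEY       NOT NULL,
    [Target]    MONEY       NOT NULL
);

CREATE TABLE dbo.SvTAdjusted (
    PrincipalID    INT         NOT NULL PRIMARY KEY,
    Principal      VARCHAR(50) NOT NULL,
    Actual         MONEY       NOT NULL,
    AdjustedTarget MONEY       NOT NULL
);

-- Seed both tables so the PowerApp has rows to adjust.
INSERT INTO dbo.SvT VALUES (1, 'Principal A', 700000, 850000);
INSERT INTO dbo.SvTAdjusted VALUES (1, 'Principal A', 700000, 850000);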
Create PowerApp

Create an Environment within the PowerApps service, adding two new Connections:
1. A connection to the CDS, using my company Microsoft account. This is where the adjusted budget values reside.
2. A connection to the Azure SQL Database, which will become the destination table to store the CDS PowerApp data.

The next step is to import the SQL data from dbo.SvTAdjusted directly into a CDS PowerApp. This automatically creates a user form containing the data. Here is where you can customise the PowerApp, such as making fields read-only and configuring the look and feel. Publish the App and change the 'Target' values to test.

Create Flow Trigger

Navigate to https://emea.flow.microsoft.com/en-us/ and log in. Create a new Flow, searching for 'Common Data Service' as the connector. Select the below and create the Flow. Select the PowerApp CDS Entity (Adjusted Target) as the source. Add a new step (Add an Action) and search for 'SQL Server'. Select SQL Server – Update Row as the destination and map it to the dbo.SvTAdjusted table. The column data types between CDS and Azure SQL Database must match when being mapped. Save the Flow.

Create Power BI Report

Create a Power BI Desktop report and connect to the Azure SQL Database. Set up the one-to-one relationship on 'PrincipalID' between the tables. Create some KPIs and a table to compare dbo.SvT and dbo.SvTAdjusted metrics. In the below example, the 'Adjusted Budget' metric will change when we make changes in the CDS PowerApp.

Embed PowerApp into Dashboard

Publish the Power BI Desktop report and pin it as a live page. To embed the PowerApp into the Dashboard, add a Tile and select Web Content. The App ID can be found under Apps in the PowerApps web portal. Simply paste the App ID into [AppID]:

<iframe width="98%" height="98%" src="https://web.powerapps.com/webplayer/iframeapp?hideNavBar=true&source=powerbi&screenColor=rgba(165,34,55,1)&appId=/providers/Microsoft.PowerApps/apps/[AppID]"></iframe>

The PowerApp is added as a Dashboard tile, and it is now possible to change the 'Budget' values. Time to test everything works. Change the values for all three records and refresh the Power BI Dashboard. The values change almost instantly!

Further Reading

Check out Part 2 of the blog, where I discuss the benefits and drawbacks I have found using Power BI and PowerApps together. Find other recommended resources below.
- Matt How's Blog - http://bit.ly/2CpbTYI
- Embed PowerApps into Power BI - http://bit.ly/2ywgsNX
- PowerApps - http://bit.ly/2Brjys4
- Flow - http://bit.ly/2CoL2vW
- Common Data Service - http://bit.ly/2CnXXhv

Contact Me

If you have any questions or want to share your experiences with PowerApps and Power BI, feel free to leave a comment below. All scripts and workbooks are available upon request.
Twitter: @DataVizWhizz

Embed PowerApps into Power BI Dashboards – Part 2

Part 2 of this blog focuses on my experiences with PowerApps, Flow and Power BI. Part 1 was more of a demo and 'how to' guide, but when I read an article online, I always find the known limitations, challenges and workarounds the most interesting takeaways. Without further ado, here are my findings.

A summary of both blogs:
- Part 1: How Power BI visuals and PowerApps can be used together.
- Part 2: Benefits and drawbacks of the tools/processes used.

Benefits
- Easy to get started. Rolling out PowerApps, Flow and Azure databases into production of course needs careful thought, but for proofs of concept, Flow (2,000 runs per month) and PowerApps (for Office 365 users) are free to use. Links to the price breakdowns are provided in the Further Reading section below.
- There is a range of wizards, templates and GUIs. All the tools used offer great templates for moving or inputting data, and the fact that barely any code is needed makes it simple for business users. Following a couple of YouTube tutorials on each technology will get people up to speed very quickly.
- Azure technologies provide seamless integration between Microsoft tools. Whilst there are other well-known, reputable cloud service providers around, using one product will always produce a slicker solution. Fewer configuration steps mean less chance of human error.
- The customisable features of PowerApps give the ability to mask, validate and format the PowerApp screens. They also make user entry a more pleasant experience, as the forms are more intuitive.

Limitations
- You can only embed PowerApps into a Dashboard – as a Tile. I am not sure if moving PowerApps into a Power BI Report is on the roadmap, but I would be surprised if it were never supported.
- Power BI Dashboards are cached and not entirely real time. You can change the cache settings to 15 minutes, but the best way to ensure your visuals contain the latest PowerApp data is to manually refresh the page in the browser. Reports do update automatically, which makes it even more frustrating.
- The Common Data Service (CDS) is a preview Data Connector in Power BI. As a result, you need to either have your environment set to 'America' and/or have been given the beta by Microsoft. If I had access to this connector, there would have been no need for the Azure SQL Database or the Flow trigger. Milinda Vitharana's blog shows how to enable CDS Power BI integration.
- If you want to use an on-premises database instead of an Azure database, an additional step is needed: a Data Gateway (link here) must be installed to move the PowerApp data back into the SQL database. I would therefore always recommend (where possible) using PaaS or other cloud services, as they talk to each other natively.
- Error handling within PowerApps is still quite limited. If Flow fails when updating data between PowerApps and the Azure SQL Database, nothing is captured within the form itself. An admin would need to check the Flow job or set up email alerts for users' peace of mind.

Conclusion

The initial signs look promising for Power BI and PowerApps integration. I managed to create an Actual vs. Target proof of concept in just a matter of hours, without any real coding. There are still quite a few drawbacks and hoops to jump through to bring everything into a Power BI Dashboard, but I can only see things getting easier from this point.
There are other use cases for embedding a PowerApp into Power BI, such as monitoring live sales and re-ordering stock within a PowerApp, or updating product descriptions that automatically update the Dashboard attributes. Giving someone the ability to directly interact with a Dashboard and make instant business decisions is priceless in today's fast-paced world.

Further Reading

Find other recommended resources below.
- PowerApps Pricing - http://bit.ly/2j5sN69
- Flow Pricing - http://bit.ly/2kw0MFr
- Milinda Vitharana's blog - http://bit.ly/2BfkywQ

Contact Me

If you have any questions or want to share your experiences with PowerApps and Power BI, feel free to leave a comment below. All scripts and workbooks are available upon request.
Twitter: @DataVizWhizz

One-way Domain Trust and Power BI

I ran into a problem setting up on-premises data gateways at a client recently: they had two domains, but with a one-way trust. The result was that authenticating within the Power BI Service to retrieve data from on-premises data sources in the untrusted domain would throw an error. At this point it is worth spending some time explaining the architecture.

The Architecture

The architecture will be familiar to many who use Power BI and the on-premises data gateway, with a little caveat. Domain 1 is the main domain. Domain 2 is the secondary domain and trusts Domain 1. Domain 1, on the other hand, does not trust Domain 2. A user in Domain 1 can access data sources in both Domain 1 and Domain 2. They can create their Power BI reports with a live connection or direct query and publish them to the Power BI Service. In order to use the reports in the Service, on-premises data gateways need to be established to provide a messaging service between on-premises and the cloud. In this example, each domain has a domain controller, a tabular server and an on-premises data gateway for each tabular server.

The Problem

When a user logged on to the Power BI Service tries to access data from Domain 2, their credentials are passed down to the on-premises data gateway, checked against the domain controller in Domain 2, and an error is returned to the Power BI Service. What I think happens is that the user (User.One@Domain1.com) has their credentials passed down through the on-premises data gateway to the domain controller in Domain 2. Either the domain controller cannot find the user, as it is the untrusted domain, and so cannot pass the short name (DOMAIN1\USERONE) to the tabular server; or it tries to check with the domain controller in Domain 1, encounters the dreaded Kerberos, and cannot perform a double hop to return the short name. Either way, the result is the same: the short name cannot be passed to the tabular server.

The Solution

As you can imagine, there are a few solutions to the problem:
- If it is a Kerberos-related issue, then Kerberos will have to be configured separately
- Make Domain 2 a trusted domain
- User mapping in the Power BI Service

The latter approach is the one I opted for, because it was guaranteed to work and would not change the current domain and network configuration. In the gateway settings in the Power BI Service, I went to the Users tab under my data source and clicked on Map user names. In there I mapped users in Domain 1 to users in Domain 2. If you have a large number of users, individual mapping might not be preferable or feasible, which is why you can replace the domain names in part of the user string, as in example 3. This does, however, rely upon users in Domain 1 having an equivalent account in Domain 2. That is not always the case, for which the wildcard to a service account would work, as shown in example 4.

Considerations for Creating a Power BI Enterprise Report Deck

Creating or re-creating an enterprise report deck in Power BI should be reasonably straightforward given a specification, but there are a number of considerations to make when building something of this nature. In the following blog post I will detail some of these, and the route I would suggest taking. The contents of this blog revolve around tabular reports rather than chart visuals, but the same themes apply.

Fonts

I think it goes without saying: keep the font consistent across both a single report and a report deck. The default for Power BI is Segoe UI, which for the most part is pleasant; just be careful not to flick between this and Segoe UI Light, as this can cause discrepancies.

It is, however, the font size that will cause you more of an issue. The first consideration is to set a minimum size. As report developers we want the report to look as tidy as possible, and usually this means fitting everything on one page. The easiest way to do this is to set the font smaller when you have space issues – but this does not always translate well to the end user. Depending on the device, they may consider the minimum Power BI lets you set (size 8) too small for consumption on something such as an iPad – so this is worth checking first.

The second consideration is to set the font size for different elements of the report, i.e. row-level data at something like a 10, and header-level elements at a 12. Anything else, such as filter elements, should be set the same as the header level. I would usually set titles a number of points above this, at something like an 18. In general, varying font sizes between elements on a single report will look inconsistent, so the aim here is consistency!

The third consideration, if possible, is to keep the font size the same across all the reports within the deck for the same type of element. Again, this adds a consistent feel to the deck. If one report has more rows than another, in my opinion it is still better to use the same font size across both, rather than filling the page on both using varying sizes.

The last consideration is to be careful when mixing text from a text box and a card together in the same area of the report. Unfortunately Power BI does not currently support expressions as SSRS does, so a text box is for static text only; dynamic text can only be created through a measure and assigned to a card. However, having both elements side by side does not give the expected result. The font size in a text box is not the same as in a card: a card size 10 is roughly equivalent to a text box size 13 (although you can only set 12-14), leaving you with inconsistent fonts between elements. My suggestion is to create measures referring to elements on the report and use them for both static and dynamic text, so that every text element is a card visual with a consistent look and feel. If you only need static text, stick to text boxes.

Objects

The next consideration is around the number of objects on a report – keep it simple. Avoid building a giant monolithic report; the more objects you use, the slower the report will perform on the Power BI Service, on iPads, and even during development. This is especially true for tables/matrices, each of which fires off separate DAX queries to return its data.
Too many objects also has knock-on effects when exporting to PowerPoint, as objects will overlap with one another more, which may not be as much of an issue within the Power BI Service but will affect other apps. You can use the selection pane (in the View tab) to move objects above or below one another, which brings forward or pushes back the elements.

Alignment

Another scenario I have come across is that it is sometimes necessary to include a column header between the header for a measure and the actual data – for instance, to specify the currency or unit. There are two options available. The first is to set the table headers as white text and insert text boxes over their position. While this achieves the goal, the final look and feel is not ideal: a large proportion of time can be spent aligning the columns with the text in the text boxes, and even then it can still be pixels out of alignment. Adding or removing measures then means going through the same process again as everything shifts left or right. Fortunately, the November release of Power BI added the ability to align data within tables better. A better approach for this scenario is to rename the measures within the table visual to whichever unit you want to show for that column. The downside is that, as a developer, you then need to hover over the measures to see where each original measure came from – a small annoyance compensated by the amount of time saved on alignment within the table. It also means fewer objects in the report, and fewer objects generally make for a quicker, more manageable report. For anyone particularly new to Power BI: you can use the arrow keys to move a single element pixel by pixel to help with alignment, and there are also options on the Format tab. I'm still waiting for the ability to pixel-nudge multiple elements selected together!

Colour

Hopefully you will be guided in terms of colour by a corporate colour scheme. This often comprises a set of four or so main RGB values, complemented by a further set of colours. Pick one (or even two) of these from the main set and use them for the report framework – either the border/frame, or the report header/footer – and then use another colour for the table headers, or two if the report requires double table headers. Again, consistency is the key across reports within the deck. If using double headers for the columns, consider using the colours as a background behind the header text rather than colouring the text in the corporate colour, and consider white text on the darker backgrounds.

Parameter Selection

Most reports will contain some kind of slicer visual to allow the user to change the context of the data – usually by period. As part of the report build, you'll need to assess the best position for this on the report, and keep it consistent between reports within the deck. If your reports need the real estate going across the page (i.e. wide tables), consider the top of the report; if they need the real estate going down the page (i.e. long tables), consider the right-hand side. By default I would build it at the top, alongside any logos and titles. If you require multiple slicers, consider moving them to the side and building a panel to group them together. Another little hidden option is that of switching the slicer visual between List and Dropdown (circled red below).
For some reason, List is the default, but most users will agree that the dropdown is neater and saves space. I'm not sure why this option appears here rather than in the standard visual configuration tab; maybe they will move it soon. The dropdown slicer visual still has some issues which I hope will be fixed soon, such as not de-selecting the dropdown after a selection has been made: another click is required outside of the dropdown to hide the options. This is not the best experience for end users, and there seems to be no viable alternative.

Header Logic Swapping

Unfortunately, as I mentioned previously, Power BI does not support expressions within tables, and therefore switching context based on a parameter is not easy to achieve. It is possible, but it needs to be done entirely within DAX. To keep the DAX measures for this simple, consider specifying the position on the report as the name of the measure. Then, within the definition of the measure, reference other created measures and keep the statement simple, allowing anyone debugging the report to trace the switching logic easily. Also use a DAX formatter such as this to make the DAX as readable as possible – it would be nice for this to be included within Power BI, hopefully soon! I've included an example DAX statement below to provide this functionality.

New Measure =
IF (
    HASONEVALUE ( 'Dim'[Value] ),
    SWITCH (
        VALUES ( 'Dim'[Value] ),
        "Comments describing the logic definition", "",
        "Value", [Measure],
        [AlternativeMeasure]
    ),
    BLANK ()
)

Template

What does all of this lead to? The answer is a template for the report deck. As well as having guidelines for the items above, it is also good to build a physical .pbix template for your suite of reports. This way you are not starting from scratch for each report, and you will get a more consistent feel, down to the pixel level, of where the objects are. Don't over-complicate the template, but leave enough elements in it to save you re-creating them each time you build a new report. I would generally avoid copying an existing report each time you build another, as this will sometimes carry over elements such as report-defined measures which you do not want to share between reports. Instead, define a template which you take from each time.

Conclusion

Once you have decided on a number of these points, it is worth gaining a consensus from the product owner on whether they are acceptable moving forward. Do not get to the end of the deck before demonstrating your decisions across the report set – that will leave you with far too much re-work. Instead, aim to deliver one of the more complex reports incorporating a number of the items above, and then apply those decisions to the rest of the report deck.

PASS Summit 2017 – Coming Soon to the Power BI Service

I recently attended a Power BI governance session at PASS 2017 where some new features were demoed in the Service, and I thought I would share these with you. The items below have been on the Microsoft roadmap and are not strictly hot off the press. However, we were told to keep an eye on the Power BI blog site (found here) over 'the next few weeks' (early December 2017) – which is pretty exciting! Without further ado, here they are:

Workspace Apps
- Selective report publishing, meaning you can pick and choose which reports are consumed by the end users. This is particularly useful if you have a combination of workspace developers and consume-only users; self-serve analysts may be working on their own reports, and they will no longer be exposed to everyone within the App.
- Install Apps automatically for existing or new users. This can tie back to Security Groups or Distribution Lists.

Collaborate with External Users
- Directly linked to Workspace Apps.
- Facilitates App and Dashboard sharing with personal work email accounts, e.g. Gmail.
- Uses Azure Active Directory business-to-business (AAD B2B) and incorporates Row Level Security (RLS). For more on AAD B2B, click here.
- UPN mapping to bridge on-premises AD to AAD and natively support external email collaboration.

Audit Logs
- Solution templates for faster, more convenient audit logging. Examples include:
  - Author tracking – which users publish the most reports.
  - Gateway activity – deleted, changed, reconfigured.
  - Report lifecycle – when reports are modified or deleted, and the history of these activities.
  - Dataset and data source tracking.

My personal favourite is the ability to control which reports are packaged into a Workspace App. This has caused me problems at various clients and made the experience of release management more convoluted and time-consuming. It will certainly please some of my customers!

Further Reading

I must thank Adam Wilson (Twitter handle below) for delivering a great session at PASS 2017. All images are taken from his slide deck.
- Adam Wilson's Twitter handle - @AdamDWilson
- Power BI Blog - http://bit.ly/20bcQb4
- AAD B2B - http://bit.ly/2mpKD7H
- PASS 2017 Website - http://bit.ly/2xSbQC0

Contact Me

I would be very interested to see what the community's thoughts are on these announcements. Feel free to comment or message me on Twitter if you want to discuss anything Power BI.
- @DataVizWhizz

Power BI Solution Templates: Easy API analytics in the cloud.

Summary

Earlier this year, an article published by Forbes Online highlighted APIs as a potential driver of the digital economy. Companies like Salesforce.com generate up to 50% of their annual revenue through APIs. I won't go into the nitty gritty of what an API is or how it works, but here is an article by the Guardian to help with that. Microsoft recently released new Power BI Template Solutions that allow 'newbie' Power BI users to create analytical dashboards to monitor the activity behind your company's APIs, in a fast and simple way. This blog will hopefully show you how.

What is a Power BI solution template?

A Power BI template is basically an incredibly simple way to quickly create enterprise-ready analytical dashboards. You can use one of the currently available solution templates and have it up and running in around 30 minutes, so that you and your team can quickly move on to more important tasks with the insights you've gained.

Why use a Microsoft Power BI Solution Template?

Here are a few reasons why it may be worth your while to look into using a Power BI solution template:
- Quick and easy to get up and running.
- Several different templates to choose from, with more to come.
- A lot of the templates are free, although you will likely need to be set up with various Microsoft accounts in order to fully utilise some solutions.
- Customise your dashboard to better tell your story.
- Scale up as and when you need it.

How do I deploy an API Management solution template?

Before we start, you will need a few things to hand:
- An Azure API Management instance (link to the Microsoft site to create an API Management service)
- Power BI Desktop (latest version)
- Power BI Pro (for sharing with your organisation)
- An application to monitor

Let's start by going to Microsoft Power BI and navigating to the solutions tab. On the apps page, look for the 'Azure API Management Analytics' template and select 'Get It Now'. You will be asked to sign into your Microsoft account, after which you'll be directed to the template start page. There are seven steps to follow to build your solution.

Step 1 - Introduction and logon. Here you will be shown the overall architecture of the API Management solution. Features of the architecture:
- Stream API request/response data from API Management into Azure SQL
- Process API data with Azure ML and Azure Functions
- Connect to Azure SQL and import data into Power BI
At the bottom of the page, sign into your Azure account.

Step 2 - Connect to your API Management service. The drop-down list should show the API Management service you have already created in Azure, or you can use the example of the API calculator above if you need to.

Step 3 - Where to store the API information. You now need to connect to an Azure SQL instance. You can either create a new instance or, if you have one already, choose to connect to it now. The above example creates a new SQL DB instance on Azure. Fill out the necessary details and check that the server name has not already been used elsewhere.

Step 4 - Azure Analysis Services. Azure Analysis Services is a fully managed platform as a service (PaaS), integrated with Azure data platform services. We won't need AAS to continue, so we can skip to step 8.

Step 8 - Verify your template selection. Confirm you have everything set up correctly. And lastly, you simply need to deploy.
Step 9 - Deployment and template solution. Using the above standard setup, the whole process took about 30 minutes to deploy, after which a .pbix file is created.

What have you actually deployed?

What does 'deploy' actually mean? Below is a list of the resources that are automatically generated for the solution:
- Azure Event Hub
- Azure Stream Analytics
- Azure SQL (or you can use an existing instance)
- Azure Analysis Services (optional, additional cost, for high-scale deployments)
- 4 Logic Apps
- A Function App containing 3 Functions
- Azure Machine Learning Web Service
All the above resources will be created under your Azure account and charged to that account. This is where the template really comes into its own: all these resources are set up without you having to do anything more than sort out a few naming conventions.

What does an API Management solution look like?

Now that you're done setting up all the resources needed and you have your .pbix file ready, open up Power BI Desktop and get your new template going. On opening the template, you will see a host of tabs. Start by going to the 'Cover Page' tab, where you will be told to edit your credentials. Editing your template credentials will connect you to the Azure DB resources you have just set up, and you should shortly begin to see all the API data that you have. Bear in mind this means seeing everything from the point at which your resources were set up; there is currently no way of getting historical data. Hopefully, getting here should be pretty simple. Let's now go through the reports available.

At a Glance: a summary page of your APIs and their usage.

API Calls: how long it takes your APIs to respond, where these calls are coming from, and more specific data for your chosen API.

Errors: as you can imagine, a pretty important page and one you probably want to keep a close eye on.

Call Frequency: useful for seeing how often your APIs are being called, and when and where the load is on them. This page shows when your applications may be busy and where that traffic is coming from. It is also a good page to check whether you have bots making continuous calls to (or spamming) your API.

Relationships: looks at how calls to your API are related; for example, there may be two calls that constantly happen at the same time. Maybe merging the two could optimise the performance of your API…

What next? Customisable?

As you can see, there is a lot of information there to work with, all created for you through this simple-to-use template. If you know a bit of Power BI, you can amend the dashboard as needed. The queries behind everything can also be amended. If there are more fields you want to report on, that will require a bit more work: you will need to make sure the data is captured and stored in the Azure SQL DB, update AllRequestData to return that field, update your model, modify a report control, add the field to the Stream Analytics request query and finally modify the Global Policy.
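To make that last part concrete, here is a purely hypothetical sketch of the first two of those steps; the real table and view definitions ship with the template, so treat all names here as placeholders:

-- Hypothetical illustration only: capture an extra field and expose it
-- through the AllRequestData view (the real schema ships with the template).
ALTER TABLE dbo.ApiRequest ADD SubscriptionKey VARCHAR(100) NULL;
GO
ALTER VIEW dbo.AllRequestData
AS
SELECT r.RequestId,
       r.ApiName,
       r.ResponseTimeMs,
       r.SubscriptionKey   -- the newly captured field
FROM   dbo.ApiRequest AS r;
GO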
So, it can be done, but it may need a little more know-how before you start.

A few things to be aware of:
- The template does not support importing historical data: you will only see data from the point your resources were created.
- When I set up my template, I configured everything as simply (i.e. as cheaply) as possible, and I would roughly say it costs $9 to $10 a day to keep all the resources up and running.
- You will need Power BI Desktop to use your template.

Generating Usage Statistics from a SSAS Tabular Cube

Once you have users accessing your cube, it's almost inevitable that at some point someone will ask you to generate usage statistics from it, and there are a number of methods to achieve this. In this quick blog post I'll detail them and my experiences with each, and then use this data to create a Power BI report.

Native Tabular Properties

The first method is natively, through the tabular cube properties. This also has the added bonus (read: impact) that it will optimise future aggregations based on usage – in fact, that's its main purpose. This can be done by setting CreateQueryLogTable to true, setting up the QueryLogConnectionString (to point to the database where the usage table should be hosted), setting the QueryLogSampling rate (10 means every 10th query will be logged), and finally the name of the QueryLog table.

The advantages of this method are that it's very easy to set up with limited knowledge required, and that it could potentially improve performance in an environment where users submit repetitive queries. Unfortunately, there are also a number of disadvantages which led me to find other methods. Firstly, it creates a degree of overhead on the cube if it samples too often; we actually had visible performance-related complaints once we turned it on – either through the sampling or the change to the "optimised" aggregations. Depending on the sampling rate, you could also find that users who rarely use the cube are not picked up in the stats. As well as this, any change to the cube structure will cause the logging table to be reset. The table is also limited in terms of what it actually logs (as you can see below) – useful if you just want user and timestamp info, but not much else, with no real ability to configure it.

AS Trace

To that extent, I looked for other tools to do the same task better, and I found AS Trace. Originally built for SQL Server 2012, it works fine on 2014 and provides the ability to run a trace against cube activities (and log to a table) exactly like SQL Profiler, but without the GUI's unnecessary memory and processor overhead. It also runs as a Windows service, allowing it to restart automatically when the server reboots; when this happens, the tool logs the existing data to a history table and truncates the logging table. Exactly what I was after.

The tool collects information based on a preconfigured Analysis Services Profiler template, which can be optimised depending on which events you are interested in. I initially ran it with most events selected, and with a limited user set it was generating in the region of 25,000 rows a day. This was clearly not maintainable for a long period of time. I then used the following blog post to understand what each event of the profiler was giving me, and created a lightweight trace definition file with just what I wanted: Query Begin and Query End (for DAX/MDX statements) and Audit Logon/Logout (for session data).

The setup is very straightforward. Run install.bat as an escalated privileged account, and check that the service installs correctly. Next, add your SSAS service account to the Log On properties of the service, and make sure the account has "Log on as a service" rights, membership of the database you are writing to in the form of DDL and DML access (i.e. able to create and write to tables), and lastly admin rights on the instance of SSAS you intend to use. Next, configure the ASTrace.exe.config file with the parameters you want the tool to use.
This includes the location of the cube (it can handle multiple cubes), the location of the trace definition file, the location of the database instance and table you want to log to, and lastly whether you want to preserve history on restart. The only thing I couldn't set here is the schema of the logging table, which defaults to dbo. All that's left is to start the service and check the log file for any start-up errors. If there are none, the table should be created correctly and awaiting input. I also saw another method while researching, using Extended Events (XEvents), but did not implement it once AS Trace had provided me with the information I needed.

View / Power BI Report

I initially used the data to run a limited set of queries extracting total users and total queries for a given time period. This was useful to a degree, but from the data collected I realised I could be doing so much more. This led me to analyse the types of metrics being logged, which allowed me to create a view on top of the tables exposing what I thought might be useful in a report. I removed all the redundant columns being tracked and created friendly names for EventSubclass and other columns. I used the PATINDEX function to check the query statement for the existence of some important values – while not an exact science, it gives a good picture of the split between certain user groups and the KPIs being run. I've included the view definition below. I ended up limiting the data to EventClass 10, as this seemed to capture all the necessary data. The only downside I have seen so far is that users querying through the Power BI web service are anonymised under the service account name. I'm currently looking into options to resolve this, which I've seen as configuration options on Power BI – to pass the username through, as long as it can be matched at the other end.

SELECT
    RowNumber AS ID,
    SPID AS SessionID,
    CurrentTime AS DateQueried,
    NTUserName AS Username,
    CASE EventSubClass
        WHEN 0 THEN 'MDX Query (Excel)'
        WHEN 3 THEN 'DAX Query (Power BI)'
        WHEN 1 THEN 'METADATA Query'
    END AS QueryType,
    CASE Success
        WHEN 1 THEN 'Successful Query'
        ELSE 'Query Error'
    END AS SuccessfulQuery,
    CONVERT(DECIMAL(10,2), CONVERT(DECIMAL(18,3), CPUTime) / 1000) AS CPUTimeSec,
    CONVERT(DECIMAL(10,2), CONVERT(DECIMAL(18,3), Duration) / 1000) AS DurationSec,
    TextData AS Query,
    CASE PATINDEX('%Mexico%', TextData) WHEN 0 THEN 0 ELSE 1 END AS MexicoMarket,
    CASE PATINDEX('%Colombia%', TextData) WHEN 0 THEN 0 ELSE 1 END AS ColombiaMarket,
    CASE PATINDEX('%CS4%', TextData) WHEN 0 THEN 0 ELSE 1 END AS CS4,
    ServerName
FROM [dbo].[ASTraceTable]
WHERE EventClass = 10

Once I had the view, creating the report was relatively straightforward, and can be seen below. I included metrics for the number of queries by user (blurred out), which also doubles as a filter; the percentage split of queries between things such as Excel and Power BI; a measure of queries by timeframe; a logarithmically scaled display of queries by duration; and lastly a split of queries by KPI. I intend to tweak these once I receive more data from the trace, but I am relatively happy with the information they provide. Please let me know if you have any comments.
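As a footnote, the simple totals I started with can be run straight off a view like the one above. A minimal sketch, assuming the view definition is saved as dbo.vwCubeUsage (the view name is illustrative):

-- Total users and total queries over the last 30 days.
SELECT COUNT(DISTINCT Username) AS TotalUsers,
       COUNT(*)                 AS TotalQueries
FROM   dbo.vwCubeUsage
WHERE  DateQueried >= DATEADD(DAY, -30, GETDATE());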

How to Find Your Next Job with Power Apps and Flow

Both PowerApps and Flow exist within the Office 365 suite and bring enormous amounts of possibility to mildly technical business users. No longer will Dan in IT, who knows a bit of VBA, be hassled to write a dodgy macro that puts some data in a database. Not only that, business users can now reach out to literally hundreds of other services that come connected straight out of the box! In this blog, I'm going to demonstrate a way we can use PowerApps to put a professional, mobile-ready interface onto a Flow, allowing us to query an API and present the results back using Power BI.

Creating a PowerApp

You can create a PowerApp in either the web portal or PowerApps Studio (https://powerapps.microsoft.com/en-us/downloads/). I personally prefer to use Studio, but both work the same, and all connections, Flows and custom APIs are managed through a web portal. If you have ever developed with Windows Forms then PowerApps will feel very comfortable. There isn't a toolbox as such, but you can easily drag and drop controls from the ribbon bar, and all the properties live on the right-hand side. It also holds some similarities with Apple's Xcode in the sense that you can see all your Screens (Scenes in Xcode) on the left.

1. Ribbon Bar: here you can drag and drop a wide range of controls, galleries and media APIs onto the App design screen
2. Preview App: this button will run/debug your App. You can also use F5
3. Screen Viewer: here you can see all the screens that make up your App
4. App Design Surface
5. Properties Window: configure properties of the controls within your App

The Common Data Service

Because we are looking at this from an Office 365 perspective we can make use of the Common Data Service, but we could also choose any other relational data store, including Oracle, MySQL, SQL Server, SharePoint, etc. As it says on the tin, the CDS is a generic, cloud-hosted database that gives users the ability to create their own datastores and then share them throughout the organisation using AD. It also integrates very nicely with PowerApps and Flow, meaning we can avoid any SQL DDL or stored procedures. Out of the box you get a range of standard tables that cover a variety of business needs, but you can also create custom entities to tailor the CDS to your specific needs. Here's an example of an entity I created in CDS to use as the main datastore for my App.

1. Ribbon Bar: new fields, import/export, settings and delete
2. Tab Bar: fields and keys; preview the data within the table
3. Custom Fields: showing data types, nullability and cardinality
4. Standard Fields: audit fields, e.g. created by / created on

Developing a PowerApp

One of the best features of PowerApps is that it is very smart with metadata: we simply need to point it at a table and PowerApps can use that to make decisions on how to construct your App in a way that suits the CRUD needs of your datastore. By creating the app from the custom CDS entity, PowerApps will know that you need a browse screen, a details screen and a new/edit record screen. Better yet, PowerApps will create and populate a form control with all of the custom fields ready to be populated. Based on each field's configuration it can auto-create mandatory flags, error handling and hint text. You may question whether PowerApps has limitations due to not having a code editor; whilst I'm sure some will find this to be true, I am yet to be disappointed.
Instead of code, PowerApps uses Excel-like functions and context variables, which will feel very intuitive to any Excel user. Context variables are stored at App level and can be read and updated from anywhere within your App. When creating the App, you can choose from a range of controls including Power BI tiles, star ratings, PDF viewers, import/export – the list goes on. Additionally, the gallery options mean you can display data or images in a real variety of ways. Above all that, though, is the integration with the device's media capabilities, which makes PowerApps a really cool product for non-coders. With PowerApps you can take and save pictures, play and record video/audio and even scan barcodes. I've made a few basic changes to my App that you can see below, but even if you hit F5 and previewed your app straight after creating it, you could successfully view, edit and input data to the database. So far I have written no code, and Dan in IT is now free to go back to work.

1. Quick Actions: PowerApps has automatically created these quick actions to submit or close the form
2. Mandatory Indicator: depends on the "Required" property in the CDS
3. Text Box: blank in New mode, shows data in Edit mode; can also show hint text and error messages if input is invalid
4. Star Rating Control: I swapped a standard integer input for a star rating to make the App more user friendly

Creating a Flow

By default, a newly built app is configured to write data back to the datastore using the SubmitForm() function. These functions are handy for a lot of things, as they take care of resetting the form after submission and of setting the form to Edit or New mode. If we want to do anything more than this – while avoiding code – then we need to start looking at Flow. Flow can do an awful lot – just look at the pre-built templates for some ideas – but I'm going to use it to call the Glassdoor API to get job progression information. To create a Flow, you need to start with a trigger. The same goes for Logic Apps, only with Flow you can trigger the process from a button press within your PowerApp. From then on you can create actions, loops, branches, conditional logic and constraints in order to connect up any number of systems (a rough sketch of this pattern in code appears at the end of this post).

1. Trigger: trigger point that is called from PowerApps
2. Initialize Variable: passes a parameter from PowerApps into a variable to be used within the Flow
3. HTTP: uses an HTTP GET to call the Glassdoor Job Progression API
4. Parse JSON: parses the JSON response from Glassdoor and surfaces the results as variables
5. Email on Failure: by using the Run After feature I have configured an email notification if the Glassdoor API call fails
6. For Each Loop: iterates over the JSON results and writes each set of variables to the database. At the moment I am using SQL so I can feed Power BI; the PowerApps team are working on deploying the CDS connector for Power BI to the UK in the coming months

The formula used to call the Flow from PowerApps looks like this:

GetFutureJobs.Run(Occupation); Navigate(Results, ScreenTransition.None, {CurrentJob: Occupation})

There are two functions in here. The first (GetFutureJobs.Run(Occupation)) executes the Flow. Anything within the brackets is passed into the Flow and can be used at any point within your process. In this case I pass in the user's current job and use that to search Glassdoor for potential next jobs. Next is the Navigate function.
This is a common occurrence in PowerApps and is used to take the user to the results screen. The first parameter is the target screen, Results. The second tells PowerApps how to transition between screens, and the final array (the bit between the {}) is a list of parameters to pass into the next screen.

Implementing a Power BI tile

The final step for my App is to analyse the results from Glassdoor using a Power BI tile. By creating a simple report and dashboard, my PowerApp now has a fully functioning Power BI tile that will refresh on the same schedule as the main Power BI report within the service.

Hopefully this blog has shown how powerful these two services can be when paired together, but also how accessible these tools now are. The fact that I can have a working mobile app within minutes is somewhat revolutionary. I can certainly see a load of opportunities for these to be used, and I encourage anyone reading this to have a play and unleash the POWER!
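For anyone curious what the Flow above boils down to, here is a minimal Python sketch of the same HTTP, Parse JSON and For Each pattern. The endpoint, query parameters and response fields are illustrative assumptions, not the real Glassdoor API contract, so treat it as a shape rather than a drop-in implementation.

import requests

def get_future_jobs(occupation):
    # "HTTP" step: GET request to a job progression endpoint (hypothetical URL)
    response = requests.get(
        "https://api.example.com/jobProgression",
        params={"jobTitle": occupation},
        timeout=30,
    )
    # "Email on Failure" step: here we simply raise instead of emailing
    response.raise_for_status()
    # "Parse JSON" step: deserialise the response body
    payload = response.json()
    # "For Each" step: shape each result ready for a database insert
    return [
        {"nextJob": item.get("title"), "overlap": item.get("overlapScore")}
        for item in payload.get("results", [])
    ]

for row in get_future_jobs("Data Analyst"):
    print(row)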

Connecting Power BI to Hive

On a recent project I was tasked with importing data into Power BI from a Hive table. For those of you who are new to Azure or Big Data, Hive is a data warehousing infrastructure for Hadoop which sits in the HDInsight stack on Azure. The primary purpose of Hive is to provide data summarisation, query and analysis over big data sets. In this blog I'm going to take you through the steps, and note any gotchas, so that you can connect to Hive using Power BI.

Connecting to Hive

As Hive is part of the Azure HDInsight stack, it would be tempting to select the HDInsight or Hadoop connector when you're getting data. However, note the HDFS in brackets beside the Azure HDInsight and Hadoop File options: this means that you'll be connecting to the underlying data store, which can be Azure Data Lake Store or Azure Blob Storage – both of which use HDFS architectures. But this doesn't help when you want to access a Hive table. In order to access a Hive table you will first of all need to install the Hive ODBC driver from Microsoft. Once you've downloaded and installed the driver you'll be able to make your connection to Hive using the ODBC connector in Power BI. You will need to input a connection string to connect, even though it says optional. The format of the connection string is as follows:

Driver={Microsoft Hive ODBC Driver};Host=hdinsightclustername.azurehdinsight.net;Port=443;Schema=default;RowsFetchedPerBlock=10000;HiveServerType=2;AuthMech=6;DefaultStringColumnLength=200;

On the next screen you'll be asked to enter a username and password. The credentials used here are not what you use to access Azure, but the credentials you created when you set up the HDInsight cluster and use to log in to it. Click connect and you'll be able to pull the tables you need into Power BI. Or, if you want to be selective in what is returned, you can write a HiveQL query in the ODBC dialog. It's also worth noting that at the moment it's only possible to import Hive data into Power BI, not use DirectQuery, so if your data set is huge you'll want to summarise the data or be really selective in what is returned first.
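Once the driver is installed, the same connection string also works outside Power BI, which is handy for sanity-checking the cluster before you blame the connector. Below is a minimal Python sketch using pyodbc; the cluster name, credentials and table are placeholders, and appending UID/PWD to the connection string is my own assumption rather than something the Power BI dialog does for you.

import pyodbc

# Same connection string as above, with hypothetical credentials appended
conn_str = (
    "Driver={Microsoft Hive ODBC Driver};"
    "Host=hdinsightclustername.azurehdinsight.net;"
    "Port=443;Schema=default;RowsFetchedPerBlock=10000;"
    "HiveServerType=2;AuthMech=6;DefaultStringColumnLength=200;"
    "UID=clusterusername;PWD=clusterpassword;"
)
conn = pyodbc.connect(conn_str, autocommit=True)

# A HiveQL query - be selective, just as with the Power BI import
cursor = conn.cursor()
cursor.execute("SELECT * FROM default.mytable LIMIT 10")
for row in cursor.fetchall():
    print(row)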

Data Data Revolution – The Results

This blog will take you through the Power BI dashboard, Data Data Revolution – The Results, which is the product of the data collected from the demo presented at the last SQLBits conference (for further details, please check my previous blog http://blogs.adatis.co.uk/josemendes/post/Data-Data-Revolution). The dashboard provides a breakdown of the players' preferences and performance, split by different indicators. In the following video, I'll show some of the conclusions we can draw from the analysis of the data.

Data Data Revolution

Following the DISCO theme, Adatis decided to present all the SQLBits attendees with a challenge based on the game Dance Dance Revolution. At the end of the game, the players were presented with two Power BI dashboards: one that streamed the data in near real time and another representing historical data. This blog will detail the different components used in the demo.

(High Level Architecture)

The starting point

The first requirement was to have a game that could run on a laptop and store the output data in a file. Based on the theme of the conference, we chose the game Stepmania 5 (https://www.stepmania.com/download/). After understanding how it worked and what type of details we wanted to capture, we adapted the program so it was possible to save the output to a TXT file every time a key was pressed. The following is an example of how the data was structured.

{"Player": "0", "Row": "768", "Direction": "Left", "NoteType": "Tap", "Rating": "OKAY", "Health": "Alive", "Combo": "0", "Score": "0", "Artist": "Katrina feat. Sunseaker", "Song": "1 - Walking On Sunshine", "Difficulty": "Easy"}

Capturing player details

To complement the game output, we decided to create an MVC application with two functions: capturing the player details in an Azure SQL DB, and uploading a new Game ID along with the player details to a reference BLOB stored in an Azure Storage container.

Sending the data to an Event Hub

Since we wanted to stream the data in near real time, we needed an application that could read the data from the output file as soon as it was updated. To achieve this, we built a C# application that sent the data to an Event Hub. To make sure we didn't upload duplicate data, we implemented logic that compared each new row with the previous one: if they were different, the row was uploaded; if not, the program waited for the next input (a sketch of this pattern appears at the end of this post).

Distributing the data

To distribute the data between the Azure SQL DB and the Power BI dataset, we used two separate Stream Analytics jobs. The first job used the Event Hub and the reference BLOB as inputs and the Azure SQL DB as output, while the second used the same inputs but a Power BI dataset as output. Due to the dataset limitations, we ensured that all the formatting was applied in the Stream Analytics query (e.g. casts between varchar and bigint, naming conventions, …).

Power BI streaming datasets

In this scenario, the streaming datasets only work properly when created by the Stream Analytics job. Any of the following actions invalidates the connection between the jobs and the dataset:

· Creating the dataset in Power BI
· Changing column names
· Changing column types
· Disabling the option Historic data analysis

When the dataset breaks, the only fix is to delete and re-create it. As a result, all the linked reports and dashboards are deleted.

Representing the data

By the time the demo was built, connectivity from Power BI Desktop to live datasets was not available, which means the live streaming dashboard was built using the online interface. It is important to note that it is impossible to pin an entire page as a dashboard when using live datasets, since it won't refresh as soon as the data is transmitted. Instead, each individual element must be pinned to the dashboard, adding some visual limitations.

The performance of the players could be followed by checking the dashboard streaming the results in near real time.
The word near appears several times in this blog because the streaming is limited not only by the internet connection but also by Power BI's concurrency and throughput constraints, meaning the results were not refreshed immediately. The second report was built using Power BI Desktop and was connected to the Azure SQL DB. At the end of the game, the players could obtain the following information:

· Who was the winner
· How they performed during the game
· The number of hits for each rating
· Which direction they were most proficient in
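As a footnote, the duplicate-guard logic in the sender application is simple enough to sketch. The version below is a minimal Python equivalent of the C# application described above, using the azure-eventhub package; the file path, Event Hub name and connection string are placeholders.

import time
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection string and hub name - substitute your own
producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...",
    eventhub_name="gameevents",
)

last_line = None
with open("stepmania_output.txt") as f:
    while True:
        line = f.readline()
        if not line:
            time.sleep(0.1)  # no new key press yet - poll again
            continue
        if line == last_line:
            continue         # duplicate guard: only upload changed rows
        batch = producer.create_batch()
        batch.add(EventData(line))
        producer.send_batch(batch)
        last_line = line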

Power BI Mobile Feature Review

In March, Microsoft released 3D Touch integration for iOS amongst several other improvements to the Power BI mobile application. This blog will look at a few of those features and examine some of the areas that still need work. So far, Microsoft seem to be doing a pretty good job of developing Power BI for the mobile platform, and this is most apparent when they exploit some of the built-in functionality that makes mobiles so handy! One of the best features of the latest iOS is 3D Touch integration, and Power BI has fully grasped this bull by the horns. Using a deep press, you can launch a pop-up menu offering some of the most useful features, such as search and notifications, but also quick access to your recently accessed dashboards.

Another big issue that the Power BI team have tackled head on is how to make rich visualisations mobile-optimised. For this they have two solutions, the first being the desktop and mobile view options within Power BI Desktop.

Desktop view of the report

Mobile view of the report

The mobile view essentially de-constructs your report and lets you drag and drop your visualisations into the mobile template. By default, this view will always be displayed when viewing the report on a mobile device, unless you rotate the device into landscape mode, in which case the desktop version loads. I have mixed feelings about this feature. On the one hand, I like that I can see both views, but if the report were to remain in mobile view and expand to fill the horizontal space as well, this could open up a lot more opportunities for mobile reporting.

However, despite gaining some control over how the report looks on a mobile device, there are some pretty major limitations for the time being. Firstly, you cannot specify different properties, such as text size, for the desktop and mobile views. This means that you need to consider both views when creating a report that will be used on both mobile and desktop, otherwise your visual will be sacrificed, as seen above in the deliveries and fragile items cards. Another drawback is that each visual element has to be snapped to the prescribed grid, and this includes the transparent grab handle/border that is used to select the visual. This border is half a grid square tall in the mobile view, which means you get a lot of white space – something you want to reduce in a mobile view. Finally, visuals cannot be overlaid. Where I have circles around some of my cards in the desktop view, this is not possible in the mobile view. Whilst you can add an image or logo, you cannot make use of any backgrounds, whether an image or just a flat colour. Thankfully, all custom visuals work in mobile view, and any cross-filtering, auto-play or auto-zoom features (maps) are preserved perfectly.

Microsoft's second solution is focussed around dashboards. From within the Power BI service you can arrange your pinned visuals into a mobile dashboard by switching the view, as shown below. However, the best part is that if you access a dashboard that doesn't already have a mobile view, the app will automatically optimise the visuals into a mobile view so you don't have to!

One of Power BI's most notable features is Q&A – a method of querying your data using natural language. Using a recognisable messenger format, this feature is really well replicated in the mobile app and adds another layer of accessibility for non-techie, non-deskbound users.

A relatively new feature in Power BI is the ability to add custom links to a dashboard visual.
This can be utilised quite nicely in the mobile app, as it will make use of the deep linking technology in iOS so that I can launch a relevant app instead of just a web page. Here I have set a Google Maps URL as the custom URL for the map visual. Clicking on this in the mobile app launches my Google Maps app, not a webpage!

Overall, I can see that the Power BI mobile team have not tried to just recreate the desktop version but have embraced the mobile platform, making use of existing features within iOS to present a feature-rich app with a familiar feel. Whilst there are some limitations, my next blog will look at how to create a mobile-optimised report right from the start, so that your users can benefit from mobile BI straight away!

Hierarchy Navigation In Power BI

Until recently, the overall functionality of the Matrix visual in Power BI was limited. However, this all changed when Microsoft announced the Power BI March update, which gives users access to the Matrix Preview. This can currently be used alongside the old matrix. In this blog, I will be comparing the old Matrix and the new Matrix Preview. The updates featured in the latter are as follows:

Performance improvements
Drilling into hierarchies
Stepped layout
Cross highlighting and filtering from matrix headers and cell contents

This article will only focus on the drilling-into-hierarchies functionality. Click here to find more information on the Matrix updates, along with the extra features not covered in this blog.

Hierarchy Drill Through

One of the visible changes in the new Matrix Preview is the ability to show all the headers collapsed by default, making the user experience easier when dealing with large datasets. The image below shows this new feature in the old and new Matrix. There is also the capability to show both levels of the hierarchy simultaneously, which is again done using the hierarchy navigation buttons, as illustrated in the image below. You can also drill up and down on individual columns using right-click, as shown in the image below. The benefit of this is that it gives the user a more detailed drill-down of a desired column. Further drill-down options are available, for example the ability to drill down on row category headers. In normal mode (without drill mode turned on), data in other row category headers remains faintly visible. Turning on drill-down mode allows users to work on a specific row category header in isolation. The following images show the differences between the two views.

Conclusion

The Matrix Preview has brought about interesting and useful tools, making the visual more interactive. The ability to drill up and down within a report particularly stands out for me. It is also worth mentioning that other features not covered in this blog give users increased customisation when working on reports – showing how impressive the Matrix Preview is. April's Power BI update includes more features for the Matrix Preview. My next blog will look at the following two features added to the Matrix Preview:

Rename axis titles
New matrix visual enhancements: column sorting, column resizing, and word wrap

Further Reading (Power BI Blogs)

https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-march-feature-summary/#matrix
https://powerpivotpro.com/2017/03/two-great-new-power-bi-features/

IoT Hub, Device Explorer, Stream Analytics, Visual Studio 2015 and Power BI

As we saw in my previous blog, the IoT Hub allows us to collect millions of telemetry readings and establish bi-directional communication with the devices. However, more than quantity, what we need are valuable insights that will lead to smart decisions. But how can we achieve that?

Collecting the data

There are thousands of sensors we can use, depending on the purpose. If we check the Microsoft documentation we will find tutorials for the Raspberry Pi, Arduino, Intel Edison, or even simulators created with .NET, Java or Node. The first step is always the creation of the IoT Hub on the Azure portal. Next, we have to add the devices, which can be done either using C# and the IoT Hub Extension for VS 2015 or via the Device Explorer. This last tool, provided by Microsoft, can easily register new devices in the IoT Hub and check the communication between the device and the cloud. Once the devices are properly configured, we will need to store the data, which can be done using an Azure SQL Database.

Representing the data

Now that we have collected the data, we want to be able to represent it. One of the best ways to do that is by creating Power BI reports and dashboards, populated via Stream Analytics. A good example of a similar architecture and example dashboards can be found on Piotr's blog, Using Azure Machine Learning and Power BI to Predict Sporting Behaviour. Note that in his example he used Event Hubs instead of the IoT Hub.

Insights and actions

Let's imagine a transportation company collecting telemetry from a food truck equipped with speed, location, temperature and braking sensors. To assist their delivery process, they have a report refreshed with real-time data that triggers alerts when certain values are reached. One of the operators receives an alert from the temperature sensor and, after checking the dashboard, realises the temperature is too high and will affect the quality of the products being transported. Instead of calling the driver to make him aware of the situation, because the sensors are connected to an IoT Hub he can simply send a command to the sensor and reduce the temperature.

More info:
https://github.com/Azure/azure-iot-sdks/commit/ed5b6e9b16c6a16be361436d3ecb7b3f8772e943?short_path=636ff09
https://github.com/Azure/connectthedots
https://sandervandevelde.wordpress.com/2016/02/26/iot-hub-now-available-in-europe/
https://powerbi.microsoft.com/en-us/blog/monitor-your-iot-sensors-using-power-bi/
https://blogs.msdn.microsoft.com/mvpawardprogram/2016/12/06/real-time-temperature-webapp/
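To make the food truck example a little more concrete, below is a minimal sketch of a simulated temperature sensor pushing telemetry to the IoT Hub using the azure-iot-device package for Python. The device connection string (the real one comes from Device Explorer or the Azure portal) and the reading ranges are placeholders.

import json
import random
import time
from azure.iot.device import IoTHubDeviceClient, Message

# Hypothetical device connection string - copy the real one from Device Explorer
client = IoTHubDeviceClient.from_connection_string(
    "HostName=myhub.azure-devices.net;DeviceId=foodtruck01;SharedAccessKey=..."
)

while True:
    # Simulated sensor readings for the food truck scenario above
    telemetry = {
        "deviceId": "foodtruck01",
        "temperatureC": round(random.uniform(2.0, 10.0), 1),
        "speedKmh": round(random.uniform(0.0, 90.0), 1),
    }
    client.send_message(Message(json.dumps(telemetry)))
    time.sleep(5)  # one reading every five seconds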