As data warehouses become more advanced and move to the cloud, referential data management is often at the bottom of the list. Being tied to an IaaS VM solely for Master Data Services feels like a big step in the wrong direction! In this blog, I will show you the secret of ‘app-iness with a cloud alternative that pieces together Azure and Office 365 services to deliver a beautifully mobile-ready front end, coupled with a serverless, scalable and artificially intelligent back end.
Before that, let’s look at why we should move on from MDS.
Excel is easy to use, but not user friendly
Excel is on nearly every desktop in any Windows-based organisation and, with the Master Data Services Add-in, it puts the data well within the reach of users. Whilst it is simple, it is in no way user friendly when compared to other applications your users may be working with. Not to mention that for most users this will be the only part of the solution they ever see! Wouldn’t it be great if there were a way to supply the same data but with an intuitive, mobile-ready front end that people enjoy using?
Developers are tightly constrained
Developers like to develop, not choose options from drop-down menus in a web-based portal. With MDS, not only can developers not make use of Visual Studio and the like, but they are also very tightly constrained by the business rules engine. We should be able to use our preferred IDE so that we can benefit from source control, frameworks and customised business logic.
Not scalable according to modern expectations
Finally, MDS cannot scale to handle any kind of “big data”. That is a bit of a buzzword, but as businesses collect more and more data, we need a data management option that can grow with it. Because MDS must be deployed on a server, there is no easy way to meet those big data requirements.
OK, so hopefully we are all agreed that we should move on from Master Data Services, but what do we move on to? The solution here pieces together a number of Azure and Office 365 services to give us an excellent front end and a robust, scalable back end. The first thing to think about is the back end. For this we should use Azure SQL DB: it offers the scalability we need and, as a reliable PaaS option, lets us maintain a cloud-based architecture.
By thinking about the data it will need to contain, we can create a data model comprising tables for user entry as well as list and hierarchy tables that supply the values for drop-down lists and other data entry controls.
When creating the tables, it is good practice to be strict with datatypes and nullability, as the front end can use this metadata to improve the app. Once we know our model, we can map individual fields to the controls available within our front end, which will be PowerApps.
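To make that concrete, a minimal sketch of what such a model might look like in T-SQL is below. The table and column names are purely illustrative assumptions, not part of any prescribed schema:

```
-- Illustrative master data model; names are assumptions.
-- List table: supplies the values for a drop-down control in the app.
CREATE TABLE dbo.CustomerType (
    CustomerTypeId  INT           NOT NULL PRIMARY KEY,
    TypeName        NVARCHAR(50)  NOT NULL
);

-- User entry table: strict datatypes and nullability flow through
-- to the PowerApps form controls.
CREATE TABLE dbo.Customer (
    CustomerId      INT            NOT NULL IDENTITY(1,1) PRIMARY KEY,
    CustomerName    NVARCHAR(100)  NOT NULL,  -- NOT NULL becomes a required field in the app
    CustomerTypeId  INT            NOT NULL
        REFERENCES dbo.CustomerType (CustomerTypeId),
    Region          NVARCHAR(50)   NULL       -- nullable columns map to optional controls
);
```

Note that the NOT NULL constraints are doing double duty here: they protect the database and drive the required-field behaviour in the generated app.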
Now that we have a data model that can map to a front end, we can build the first version of our app. For a guide on how to build an app from a data source, you should follow this blog. The important things to consider here are the routes back to your data source and how to validate your data as cleanly as possible. The first consideration, the route back to your data, can be solved in one of two ways depending on the complexity of the scenario. If you simply need to submit data back to a table, you can use the pre-configured SubmitForm() function, which validates your data and then pushes it into the table. If you need to execute a stored procedure, or make use of looping, conditional switching and so on, you can create Flow pipelines to take care of those requirements. Alongside those benefits, Flow has some other great features:
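As a rough sketch, the two routes might look like this in PowerApps formulas. The control, screen and Flow names are hypothetical:

```
// On the submit button's OnSelect property (simple route):
SubmitForm(CustomerForm)

// On the form's OnSuccess property, confirm and return to the gallery:
Notify("Customer saved", NotificationType.Success);
Navigate(BrowseScreen, ScreenTransition.None)

// Where a stored procedure or conditional logic is needed,
// hand the values to a Flow instead of submitting directly:
CreateCustomerFlow.Run(NameInput.Text, TypeDropdown.Selected.Value)
```

SubmitForm() handles validation and the write in one step, whereas the Flow route trades that convenience for the freedom to run arbitrary logic server-side.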
- Flows can send push notifications to any device that has PowerApps installed. You can easily create a pattern where User A creates a new customer, which triggers Flow to notify User B. User B can then open the notification on their device which, if configured correctly, opens their version of the PowerApp with that new customer loaded.
- Flows can be triggered by updates or inserts to a SQL table, meaning that if an external system updates a record you can still get hold of that event and apply your business logic.
The second consideration, data validation, can be tricky depending on the scenario. Based on the nullability of a column in your source data, PowerApps will automatically create a required flag for that field, meaning that if a user tries to submit the form without it they will get an error and be forced to enter the missing data; crisis averted. If, however, you have pushed the validation logic into your Flow or a stored procedure, you cannot push that error information back to your user to prompt some action. You can configure a Response object that is executed using the run-after options within Flow, but this can only provide detail back to the PowerApps designer, not the user.
The final piece of the puzzle is Azure Data Lake Analytics, which is what joins our master data to any transactional data, whether that be on-prem, in an Azure SQL service (DB/DW) or stored in flat files within Azure Data Lake. The architecture for this is shown below.
In this diagram, the PowerApp reads and writes data to and from the SQL DB. Meanwhile, source data is landing in our Data Lake. If you are interested in the structure of my data lake then check out Ust’s blogs on the Adatis Data Lake Framework. By using the U-SQL language (shown below) we can create an external data source for our master data and join that to the transactional flat file data, outputting the result to blob storage for further processing. (more info here)
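A U-SQL script along these lines might look like the sketch below. The database, table, path and data source names are assumptions for illustration, and the external data source itself would be registered separately with CREATE DATA SOURCE:

```
// Illustrative U-SQL; names and paths are assumptions.
// Read the transactional flat files from the lake.
@transactions =
    EXTRACT CustomerId int,
            SaleDate   DateTime,
            Amount     decimal
    FROM "/RAW/Sales/{*}.csv"
    USING Extractors.Csv(skipFirstNRows: 1);

// Pull the master data from the Azure SQL DB via the
// pre-registered external data source.
@master =
    SELECT CustomerId, CustomerName
    FROM EXTERNAL MasterDataSource
    EXECUTE "SELECT CustomerId, CustomerName FROM dbo.Customer";

// Join master data onto the transactions.
@joined =
    SELECT m.CustomerName, t.SaleDate, t.Amount
    FROM @transactions AS t
    INNER JOIN @master AS m ON t.CustomerId == m.CustomerId;

// Output the enriched result to blob storage for further processing.
OUTPUT @joined
TO "wasb://output@myaccount.blob.core.windows.net/sales_enriched.csv"
USING Outputters.Csv(outputHeader: true);
```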
Hopefully you can see from this brief overview that we can leave Master Data Services behind and replace it with a much better alternative that covers everything MDS is missing. For any comments, feedback or questions, please comment on this post or catch me on Twitter @MattTheHow.