
Data Data Revolution – The Results

This blog will take you through the Power BI dashboard, Data Data Revolution – The Results, which is the product of the data collected from the demo presented at the last SQLBits conference (for further details, please check my previous blog http://blogs.adatis.co.uk/josemendes/post/Data-Data-Revolution).


This dashboard provides a breakdown of the players' preferences and performance, split by different indicators. In the following video, I'll show some of the conclusions we can draw from the analysis of the data.

Data Data Revolution

Following the DISCO theme, Adatis decided to present all the SQLBits attendees with a challenge based on the game Dance Dance Revolution. At the end of the game, the players were presented with two Power BI dashboards, one streaming the data in near real time and the other showing historical data. This blog will detail the different components used in the demo.

SQLBits Architecture

(High Level Architecture)

 

The starting point

The first requirement was to have a game that could run on a laptop and store the output data in a file. Based on the theme of the conference, we chose the game Stepmania 5 (https://www.stepmania.com/download/). After understanding how it worked and what type of details we wanted to capture, we adapted the program so it was possible to save the output in a TXT file every time a key was pressed. Following is an example of how the data was structured.

{"Player": "0", "Row": "768", "Direction": "Left", "NoteType": "Tap", "Rating": "OKAY", "Health": "Alive", "Combo": "0", "Score": "0", "Artist": "Katrina feat. Sunseaker", "Song": "1 - Walking On Sunshine", "Difficulty": "Easy"}
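As a rough illustration of how a downstream consumer could handle one of these rows, here is a minimal C# sketch that maps a line of the output file to a strongly typed object. The class name, property types and the use of the Newtonsoft.Json package are assumptions made for the example, not part of the original demo.

```csharp
using Newtonsoft.Json;

// Illustrative class mirroring the fields written by the adapted Stepmania build.
public class GameEvent
{
    public string Player { get; set; }
    public string Row { get; set; }
    public string Direction { get; set; }
    public string NoteType { get; set; }
    public string Rating { get; set; }
    public string Health { get; set; }
    public string Combo { get; set; }
    public string Score { get; set; }
    public string Artist { get; set; }
    public string Song { get; set; }
    public string Difficulty { get; set; }
}

public static class GameEventParser
{
    // Parses a single line of the TXT output into a GameEvent.
    public static GameEvent Parse(string line) =>
        JsonConvert.DeserializeObject<GameEvent>(line);
}
```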

 

Capturing player details

To complement the game output, we decided to create an MVC application with two functions: capturing the player details in an Azure SQL DB, and uploading a new Game ID along with the player details to a reference BLOB stored in an Azure Storage container.
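As a rough sketch of the second function, the upload to the reference BLOB could look like the snippet below, using the WindowsAzure.Storage SDK. The connection string, container and blob names and the payload format are placeholders, not the values used in the demo.

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class ReferenceBlobWriter
{
    // Placeholder names - replace with the real storage account, container and blob.
    private const string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...";
    private const string ContainerName = "reference";
    private const string BlobName = "players.json";

    // Writes a new Game ID and the player details to the reference BLOB
    // that the Stream Analytics jobs use as reference input.
    public async Task UploadAsync(string gameId, string playerDetailsJson)
    {
        var account = CloudStorageAccount.Parse(ConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference(ContainerName);
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlockBlobReference(BlobName);
        await blob.UploadTextAsync($"{{\"GameId\":\"{gameId}\",\"Details\":{playerDetailsJson}}}");
    }
}
```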

 

Sending the data to an Event Hub

Since we wanted to stream the data in near real time, we needed an application that could read the data from the output file as soon as it was updated. To achieve this, we built a C# application that sent the data to an Event Hub. To make sure we didn't upload duplicate data, we implemented logic that compared the latest row with the previous one: if they differed, the row was uploaded; if not, the program waited for the next input.
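A stripped-down sketch of that polling and de-duplication logic is shown below, using the Microsoft.Azure.EventHubs package. The connection string, file path and polling interval are placeholders, and the real application may well have been structured differently.

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

public class GameOutputForwarder
{
    // Placeholder values - not the actual ones used in the demo.
    private const string EventHubConnectionString = "Endpoint=sb://...;EntityPath=gameevents";
    private const string OutputFilePath = @"C:\Stepmania\output.txt";

    public static async Task RunAsync()
    {
        var client = EventHubClient.CreateFromConnectionString(EventHubConnectionString);
        string previousRow = null;

        while (true)
        {
            // Read the last row written by the game.
            var lines = File.ReadAllLines(OutputFilePath);
            var lastRow = lines.Length > 0 ? lines[lines.Length - 1] : null;

            // Only send the row if it differs from the previously uploaded one.
            if (lastRow != null && lastRow != previousRow)
            {
                await client.SendAsync(new EventData(Encoding.UTF8.GetBytes(lastRow)));
                previousRow = lastRow;
            }

            await Task.Delay(100); // wait for the next key press
        }
    }
}
```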

 

Distributing the data

To distribute the data to both the Azure SQL DB and the Power BI dataset, we used two separate Stream Analytics Jobs.

The first job used the Event Hub and the reference BLOB as inputs and the Azure SQL DB as output, while the second used the same inputs but had a Power BI dataset as output. Due to the dataset limitations, we ensured that all the formatting was applied in the Stream Analytics query (e.g. casts between varchar and bigint, naming conventions, …).
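For illustration, a simplified version of such a query could look like the following. The input, reference and output aliases and the column names are assumptions; only the general shape (reference data join, casts and renames) reflects what is described above.

```sql
-- Simplified Stream Analytics query: joins the Event Hub stream with the
-- reference BLOB, applies casts and naming conventions, and writes to Power BI.
SELECT
    CAST(e.Player AS bigint)  AS PlayerNumber,
    CAST(e.Score  AS bigint)  AS Score,
    CAST(e.Combo  AS bigint)  AS Combo,
    e.Rating                  AS Rating,
    e.Direction               AS Direction,
    r.PlayerName              AS PlayerName,
    r.GameId                  AS GameId
INTO
    [powerbi-output]
FROM
    [eventhub-input] e
JOIN
    [reference-blob] r
    ON e.Player = r.Player
```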

 

Power BI streaming datasets

In this scenario, the streaming datasets only work properly when created by the Stream Analytics Job. Any of the following actions invalidates the connection between the jobs and the dataset:

· Creating the dataset directly in Power BI

· Changing column names

· Changing column types

· Disabling the Historic data analysis option

When the dataset crashes, the only solution to fix the issue is to delete and re-create it. As a result, all the linked reports and dashboards are deleted.

 

Representing the data

At the time the demo was built, connecting Power BI Desktop to live streaming datasets was not yet available, which meant the live streaming dashboard had to be built using the online interface.

It is important to note that it is not possible to pin an entire report page to a dashboard when using live datasets, since the page won't refresh as soon as new data arrives. Instead, each individual element must be pinned to the dashboard, which adds some visual limitations.


The performance of the players could be followed on the dashboard streaming the results in near real time. The word near is used several times in this blog because the streaming is limited not only by the internet connection but also by Power BI's concurrency and throughput constraints, meaning the results were not refreshed immediately.

The second report was built using Power BI Desktop and was connected to the Azure SQL DB.


At the end of the game, the players could obtain the following information:

· Who was the winner

· How they performed during the game

· The number of hits for each rating

· The direction in which they were most proficient

Design Thinking

On February 24th, I had the opportunity to present my first Half Hour Huddle on the subject of “Design Thinking”. The session followed a format where I challenged the participants to solve a problem using the methodology while I guided them through the process. The result? An assortment of different and interesting ideas, ranging from a Christmas training course for people who don't appreciate the season; to a service that would allow you to travel around the world, buy the best products of each region and have them delivered by hot air balloon; to a SpongeBob SquarePants with an inner rubber pocket that allows the user to pour in some mulled wine or hot coffee.

The Introduction to Design Thinking

According to Wikipedia, Design Thinking refers to creative strategies designers utilize during the process of designing. It is also an approach that can be used to consider issues and resolve problems more broadly than within professional design practice, and has been applied in business and to social issues (https://en.wikipedia.org/wiki/Design_thinking).

In other words, Design Thinking is a methodology focused on the users' experiences, especially their emotional ones, that creates models to examine complex problems, builds prototypes to explore potential solutions, tests the ideas and, most importantly, tolerates failure.


The methodology follows 5 different stages:

Empathize – Create empathy with the user and start to build a connection

Define – Define a problem statement from the previous empathy work

Ideate – Brainstorm to get a lot of new ideas to solve the defined problem

Prototype – Build and make things

Test – Test the concepts created with the users

 

The challenge

Participate in a crash course and redesign the gift-giving experience in about 40 minutes.

For the crash course, the participants formed pairs and were told they had to redesign their partner's gift-giving experience while following the supporting material (https://static1.squarespace.com/static/57c6b79629687fde090a0fdd/t/58992ddd46c3c4da5df52911/1486433757845/Participant-Worksheet.pdf). In other words, the interviewee had to think about the last gift he/she offered and talk the interviewer through the whole experience.

Definition of experience: from realizing you have to buy a gift (or that you forgot to buy one), to thinking about what you might get, to purchasing it, wrapping it and offering it to the other person.

 

The 9 Steps to Achieve Success

EMPATHIZE

1. Interview

The challenge is to design something useful and meaningful for the partner, and the most important part of designing for someone is to gain empathy for that person. This means that, in this step, the interviewer will ask questions that allow him to create a connection and reach the emotions of the interviewee (e.g. When was the last time you gave a gift? How did it go? What was your favorite part? Least favorite?)

2. Dig Deeper

After creating a connection, the interviewer will want to forget about the gift and find out what's important for the interviewee. He will want to dig deeper and seek emotions, stories and motivations, which is why an Excel file is not used in this methodology. (e.g. If the interviewee says he offered a gift to his mother and feels emotional, the interviewer will want to explore the subject and ask what is going on with his mother and why he felt the need to offer her a gift.)

 

DEFINE

3. Capture Findings

The interviewer will synthesize the learnings into a few “needs” he discovered and a few “insights” he found interesting.

Needs – typically verbs, are actions the person is trying to achieve while offering a gift (eg. Show love, be appreciated, trying to feel important)

Insights – learnings from the partner’s feelings (eg. The interviewee offered a gift because he/she feels pleased to make the other person happy)

4. Define problem statement

Using the needs and insights, the interviewer will create a statement he's going to address with the design, which means it has to be something actionable and doable (e.g. Paul wants to reconnect with an old friend because he misses the adventures they shared when they were young).

 

IDEATE

5. Sketch

The interviewer will sketch at least 5 radical ways to meet the interviewee's needs. In this step, perfection is not needed and quantity is more important than quality, since the interviewer will want to explore all the possibilities.

6. Share the solutions and capture feedback

The interviewer will share the sketches with the interviewee and capture the feedback by asking open questions, always taking care not to defend his ideas or try to convince him/her of what is good or bad (e.g. What did you think about this sketch? What do you think went wrong? What is missing?)

7. Reflect and generate a new solution

The interviewer will incorporate what he learned from the solutions and the feedback provided and will create one single sketch, which can be an improvement of something he had sketched previously or something completely new.

 

PROTOTYPE

8. Build your solution

Using different art and craft materials (kitchen foil, paper clips, duct tape, balloons, plasticine, post-its, …) the interviewer will prototype the solution sketched. It should be something the interviewee can engage with and react to.

 

TEST

9. Share your solution and get feedback

The interviewer will capture the feedback provided by noting down what worked, what could be improved, and the questions and ideas the interviewee raised while testing the solution.

 

The Result

At the end of the session, the participants managed to apply the methodology to what could be a very complex experience for some users. Some great and crazy ideas were generated and, who knows, maybe the next big thing was born that day.

 

For more info on Design Thinking and how companies like IBM and GE are applying it in their business, check the following links:

https://dschool.stanford.edu/resources/gear-up-how-to-kick-off-a-crash-course

https://hbr.org/2015/09/design-thinking-comes-of-age

IoT Hub, Device Explorer, Stream Analytics, Visual Studio 2015 and Power BI

As we saw in my previous blog, the IoT Hub allows us to collect millions of telemetry messages and establish bi-directional communication with the devices. However, more than quantity, what we need is valuable insights that lead to smart decisions. But how can we do that?

Collecting the data

There are thousands of sensors we can use, depending on the purpose. If we check the Microsoft documentation we will find tutorials for the Raspberry Pi, Arduino, Intel Edison or even simulators created with .NET, Java or Node.js.

The first step is always the creation of the IoT Hub in the Azure Portal. Next, we have to add the devices, which can be done either using C# and the IoT Hub Extension for VS 2015 or using the Device Explorer. This last tool, provided by Microsoft, makes it easy to register new devices in the IoT Hub and to check the communication between the device and the cloud.
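For reference, registering a device from C# (rather than with Device Explorer) follows the pattern below, using the Microsoft.Azure.Devices package. The connection string and device id are placeholders.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Devices;
using Microsoft.Azure.Devices.Common.Exceptions;

public class DeviceRegistration
{
    // Placeholder connection string for the IoT Hub (iothubowner policy).
    private const string IotHubConnectionString = "HostName=...;SharedAccessKeyName=iothubowner;SharedAccessKey=...";

    public static async Task<string> RegisterAsync(string deviceId)
    {
        var registryManager = RegistryManager.CreateFromConnectionString(IotHubConnectionString);

        Device device;
        try
        {
            // Create the device identity in the IoT Hub registry.
            device = await registryManager.AddDeviceAsync(new Device(deviceId));
        }
        catch (DeviceAlreadyExistsException)
        {
            // The device was registered before - just fetch it.
            device = await registryManager.GetDeviceAsync(deviceId);
        }

        // The symmetric key is what the device uses to authenticate.
        return device.Authentication.SymmetricKey.PrimaryKey;
    }
}
```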

Once the devices are properly configured, we will need to store the data, which can be done using an Azure SQL Database.

 

Represent the data

Now that we have collected the data, we want to be able to represent it. One of the best ways to do that is by creating some Power BI reports and dashboards, which will be populated via Stream Analytics.

A good example of a similar architecture and example dashboards can be found in Piotr's blog Using Azure Machine Learning and Power BI to Predict Sporting Behaviour. Note that in his example, he used Event Hubs instead of the IoT Hub.

 

Insights and actions

Let's imagine a transportation company is collecting telemetry from a food truck equipped with speed, location, temperature and braking sensors. In order to assist their delivery process, they have a report refreshed with real-time data that triggers alerts when certain values are reached.

One of the operators receives an alert from the temperature sensor and, after checking the dashboard, realizes the temperature is too high and will affect the quality of the products being transported. Instead of calling the driver to make him aware of the situation, because the sensors are connected to an IoT Hub, he can simply send a command to the device and reduce the temperature.
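A minimal sketch of that cloud-to-device command is shown below, using the Microsoft.Azure.Devices package. The device id, payload format and connection string are made up for the example.

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

public class TemperatureCommander
{
    // Placeholder IoT Hub (service policy) connection string.
    private const string IotHubConnectionString = "HostName=...;SharedAccessKeyName=service;SharedAccessKey=...";

    // Sends a cloud-to-device message telling the truck's controller to lower the temperature.
    public static async Task ReduceTemperatureAsync(string deviceId, double targetCelsius)
    {
        var serviceClient = ServiceClient.CreateFromConnectionString(IotHubConnectionString);

        var payload = $"{{\"command\":\"setTemperature\",\"target\":{targetCelsius}}}";
        var message = new Message(Encoding.UTF8.GetBytes(payload));

        await serviceClient.SendAsync(deviceId, message);
    }
}
```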

 

More info:

https://github.com/Azure/azure-iot-sdks/commit/ed5b6e9b16c6a16be361436d3ecb7b3f8772e943?short_path=636ff09

https://github.com/Azure/connectthedots

https://sandervandevelde.wordpress.com/2016/02/26/iot-hub-now-available-in-europe/

https://powerbi.microsoft.com/en-us/blog/monitor-your-iot-sensors-using-power-bi/

https://blogs.msdn.microsoft.com/mvpawardprogram/2016/12/06/real-time-temperature-webapp/

Azure Event Hubs and IoT Hub

Imagine that you are the CEO of a big Logistics & Transport company that operates across the UK. In order to obtain multiple insights that will allow you to efficiently analyse how your company is performing and help you make better decisions, you decide to start collecting different telemetry information from the vehicle fleet. The question is, how will you manage to deal with hundreds of connected devices producing millions of telemetry messages? The answer is…

Event Hubs

Back in 2014, Microsoft announced the release of Azure Event Hubs, a service that allows high-throughput ingestion of data streams generated by devices and services in an easy, secure and reliable way.

An Event Hub can be created either through the Azure Portal or the Management API and becomes immediately available without the need for further setup or management/maintenance. Message streaming is achieved through partitions: following the partitioned consumer pattern, each consumer only reads a specific subset (partition) of the message stream.
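To make the partitioned consumer pattern a little more concrete, here is a rough sketch of a receiver that reads a single partition through the default consumer group, using the Microsoft.Azure.EventHubs package. The connection string and partition id are placeholders.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

public class PartitionedConsumerExample
{
    // Placeholder Event Hub connection string (including EntityPath).
    private const string ConnectionString = "Endpoint=sb://...;EntityPath=telemetry";

    // Reads a batch of events from a single partition through the default consumer group.
    public static async Task ReadPartitionAsync(string partitionId)
    {
        var client = EventHubClient.CreateFromConnectionString(ConnectionString);
        var receiver = client.CreateReceiver(
            PartitionReceiver.DefaultConsumerGroupName, partitionId, EventPosition.FromStart());

        var events = await receiver.ReceiveAsync(maxMessageCount: 100);
        if (events != null)
        {
            foreach (var eventData in events)
            {
                var body = eventData.Body;
                Console.WriteLine(Encoding.UTF8.GetString(body.Array, body.Offset, body.Count));
            }
        }

        await receiver.CloseAsync();
        await client.CloseAsync();
    }
}
```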


 

IoT Hub

In a very simplistic way, the Azure IoT Hub is the bridge between our devices and the cloud. It is a fully managed service that enables reliable and secure device-to-cloud and cloud-to-device communication between millions of IoT devices and provides a service interface to support application development.
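As an illustration of the device side of that bridge, sending a device-to-cloud message looks roughly like the snippet below, using the Microsoft.Azure.Devices.Client package. The connection string and payload are placeholders.

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

public class VehicleTelemetrySender
{
    // Placeholder device connection string obtained when the device was registered.
    private const string DeviceConnectionString = "HostName=...;DeviceId=truck-01;SharedAccessKey=...";

    // Sends one telemetry reading from the device to the IoT Hub.
    public static async Task SendTelemetryAsync(double speed, double temperature)
    {
        var deviceClient = DeviceClient.CreateFromConnectionString(DeviceConnectionString, TransportType.Mqtt);

        var payload = $"{{\"speed\":{speed},\"temperature\":{temperature}}}";
        var message = new Message(Encoding.UTF8.GetBytes(payload));

        await deviceClient.SendEventAsync(message);
    }
}
```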


 

Event Hubs or IoT Hub

If we consider the IoT Hub an improvement over Event Hubs, should we assume that the solution to the scenario described at the top of the article is the former? The answer is… it depends on what we want to achieve.

If our needs require bidirectional communication and millions of simultaneously connected devices, IoT Hub would be the choice; however, combining the IoT Hub and Event Hubs instead of using them separately is an even better option. While the IoT Hub deals with the device-to-cloud communication, the Event Hubs can deal with the huge volume of events produced by our vehicle fleet.

 

More Info:

https://azure.microsoft.com/en-us/blog/announcing-azure-event-hubs-general-availability/

https://docs.microsoft.com/en-gb/azure/event-hubs/event-hubs-overview

https://azure.microsoft.com/en-gb/documentation/learning-paths/iot-hub/

https://azure.microsoft.com/en-gb/resources/videos/azurecon-2015-overview-of-azure-iot-hub/

http://www.jamesserra.com/archive/2017/02/iot-hub-vs-event-hub/