Dan Perrin

Dan Perrin's Blog

First-class Data Analytics Service – Adatis Managed Services


Founded in 2006, Adatis has gained over 13 years of experience in delivering data & analytics platforms and is recognised as an expert in Microsoft Azure Data Platform, Advanced Analytics, Power BI and SQL Server BI capabilities.

The need to harness the potential of new data sets, and the ever-changing range of available technologies, creates a double-edged challenge: not only to operate such a platform but also to evolve it, in order to maintain competitive advantage.

Adatis has been working with an ever-increasing number of its clients to provide a specialist, modern Managed Service focused on the efficient operation and effective evolution of their data & analytics platforms, ensuring platforms are not only delivered but also established with a fit-for-purpose service, adopted so that the business benefits are realised, and adapted to meet ongoing changes in need.

With this growing focus and a team of dedicated data & analytics support experts established in the UK and Bulgaria, Adatis Managed Services has been launched.

Adatis Managed Services

Unpredictable or Out of Control Data Analytics Platform Consumption?

We like the idea of paying for what we consume. But do you, like me, end up with a fridge of rotten peppers because you didn’t eat what you had planned? Or when it comes to the end of the evening and you pay the bar bill, have those couple of quick beers after work turned into something more than anticipated?

Data analytics is more relevant in today’s business than ever. More and more organisations are looking to flexible, adaptive cloud-based platforms such as Azure to deliver data analysis services with agility, support rapid change and, alongside this, meet increasing demand for Machine Learning and AI.

Whilst this has brought many organisations a significant reduction in the capital expenditure needed to deliver such a platform, do you have predictable and controlled operational expenditure for your consumption?

A black-box or just a financial black-hole?

Microsoft, for example, provides some excellent calculators to determine the scale and cost of its Azure services. These should ensure that none of this is a black box, let alone a black hole. However, your platform is unlikely to be static and may consist of multiple services with differing consumption drivers, whether that is uptime, data volumes, processing time or combinations thereof.

In addition, your approach to delivery will have changed. No longer will a repurposed server with developer edition software be enough to provide potential delivery environments. And what is the impact on your consumption of testing data process flows, backloading data or leaving processing running over the weekend to hit deadlines?

Is there a one size fits all fix?

Realistically there isn’t a one-size-fits-all answer to capacity optimisation and consumption control without constraining the potential of your organisation’s data. Within the Adatis Managed Service we place a robust focus on Efficient Processes, Effective Governance and Impactful Monitoring.

Efficient Processes is based on our extensive experience of delivering and operating data analytics platforms, recognising that some ways of working may need to change within a consumption-based environment. Ideally this element of the service commences before the first Azure service is provisioned for development, so that appropriate processes, such as Azure runbooks and monitoring, are established right from the start, but it is never too late. We can provide recommendations on approaches to delivery with consumption in mind, data management and archiving strategies, and change management processes that provide control without losing agility.
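
To make that concrete, here is a minimal sketch of the sort of job an Azure runbook might run on a schedule: deallocating tagged development VMs overnight so they stop accruing compute charges. The subscription ID, resource group and tag are illustrative placeholders rather than anything from a real engagement, and a fuller implementation would cover the other services in your platform too.

```python
# Minimal sketch: deallocate dev/test VMs outside working hours to limit consumption.
# Subscription, resource group and tag names are illustrative; adapt to your environment.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-analytics-dev"          # hypothetical dev/test resource group

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, SUBSCRIPTION_ID)

for vm in compute.virtual_machines.list(RESOURCE_GROUP):
    tags = vm.tags or {}
    # Only touch machines explicitly tagged as safe to stop overnight.
    if tags.get("auto-shutdown") == "true":
        print(f"Deallocating {vm.name} to stop compute charges")
        compute.virtual_machines.begin_deallocate(RESOURCE_GROUP, vm.name).wait()
```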

Effective Governance ensures that the most basic factors of Azure consumption are controlled across all environments. This means considering who provisions services, and when and where (in terms of environments) they are turned on, scaled up or hit hard, and, more critically, ensuring they are withdrawn, turned off, scaled down or proactively balanced.

Impactful Monitoring responds to the reality of the platform as data volumes and user demands change. Acting on facts from the monitoring provided by Azure, the Adatis Delivery Framework and optional third-party providers ensures that data processes are optimised, capacity headroom is adjusted up or down, usage is analysed and redundant components are decommissioned, as well as identifying trends to understand the impact of necessary or expected changes.
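
As a simple illustration of acting on monitoring facts, the sketch below pulls a week of hourly CPU metrics for one resource via the azure-monitor-query package and flags sustained low utilisation as a candidate for scaling down. The resource ID and the 40% threshold are assumptions made for the example, not fixed rules from the Adatis Delivery Framework.

```python
# Minimal sketch: check a week of CPU metrics for a resource and flag unused headroom.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

RESOURCE_ID = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Compute/virtualMachines/<vm-name>"
)  # illustrative resource; any Azure resource that emits metrics works similarly

client = MetricsQueryClient(DefaultAzureCredential())
response = client.query_resource(
    RESOURCE_ID,
    metric_names=["Percentage CPU"],
    timespan=timedelta(days=7),
    granularity=timedelta(hours=1),
    aggregations=[MetricAggregationType.AVERAGE],
)

# Flatten the hourly averages and check how much headroom is going unused.
samples = [
    point.average
    for metric in response.metrics
    for series in metric.timeseries
    for point in series.data
    if point.average is not None
]
if samples and max(samples) < 40:
    print(f"Peak hourly CPU over 7 days was {max(samples):.1f}% - consider a smaller SKU")
```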

As a Microsoft Cloud Solution Provider, we can also directly provision your Azure subscription with an approach that gives assurance over Azure consumption and costs.

If you’d like to understand more about the Adatis approach, and how our specialist Data Analytics Platform and AI Managed Service can help you stay in control, just drop me an email at dan.perrin@adatis.co.uk.

Online gaming is booming – and it’s all thanks to data

You only need to catch a single ad break during a Premier League match to see how crowded the online gaming market is.

In a hyper-competitive industry where loyalty is supposedly dead – if you believe this tongue-in-cheek ad from PaddyPower – many companies are focusing on new technologies and data models to get ahead.

And it’s no wonder, with tech such as mobile gaming and virtual reality turning the online gaming sector into an exceptionally lucrative industry; estimates suggest it will hit $73 billion globally within the next five years.

The trends challenging the gaming industry will be familiar to all – the need to improve customer experiences, the push to experiment with big data, the desire for a single customer view across the business – but in an industry where the house always wins, there’s the appetite to push ahead.

For other highly competitive industries, where it’s hard to differentiate from other companies – think retail – there are several lessons to be learned from how the gaming industry is adapting.

3 ways gaming companies use data to get ahead

Here are three smart ways gaming companies are using data to stand out from the crowd:

1. Personalise and target with real-time model scoring

Batch scoring has long been the approach of choice for marketers. Analyse hundreds of thousands of data points, and you’ll get a good audience segment to target – though it takes a long time to run the numbers.

But increasingly, gaming companies are turning to real-time model scoring to personalise gaming experiences much more closely. They can monitor who’s doing well and who’s losing, and identify the best way to keep people engaged – or even, in some cases, discourage them from playing.

Take poker, for example. In poker, the gaming company just takes a cut of the stakes pot, so it’s in their best interests if players stay at the table and commit a stake. Ideally, you want a reasonably equal balance of skills in poker, to keep everyone happy. But if there’s one ‘shark’ who keeps winning, it can quickly put others off.

Previously, algorithms would simply identify these players as VIPs, and reward them. But, as gaming companies explored their data more, they realised that these players were making others more likely to quit. Real-time model scoring now lets gaming companies identify these ‘sharks’ faster, and stop them playing with less confident or competent players.

You can also use the real-time capabilities of this model to catch a despondent player before they drop off. By identifying someone who’s losing frequently, and therefore likely to stop playing altogether, the gaming company can offer them personalised incentives to keep playing. However, there’s a delicate balance to be struck to ensure the company is still encouraging responsible play. Batch scoring could only monitor activity every 24 hours or so, but real-time model scoring means the company can keep an eye on how long someone’s been playing, and what their record’s like – so they can help the player know when to stop.

We won’t see the back of batch scoring for a long time yet – it still has its place for large-scale marketing efforts and getting a big-picture view of what’s going on in an organisation. But real-time model scoring is the future for marketers and data scientists that want to get more targeted in how they use data in their interactions with customers.
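
For readers who want to see the distinction in code, here is a minimal, hypothetical sketch: a model is trained offline on historical sessions (the batch step), then individual in-play events are scored one at a time as they arrive (the real-time step). The features, tiny training set and drop-off threshold are invented purely for illustration.

```python
# Minimal sketch: batch-trained model, scored one event at a time in (near) real time.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Offline/batch step: fit on historical sessions
# (features: hands played, net winnings, session length in minutes).
X_train = np.array([[120, -45.0, 95], [30, 10.0, 20], [200, -120.0, 180], [15, 5.0, 10]])
y_train = np.array([1, 0, 1, 0])  # 1 = player stopped playing soon afterwards
model = LogisticRegression().fit(X_train, y_train)

def score_event(hands_played: int, net_winnings: float, minutes_played: int) -> float:
    """Return the probability that this player is about to drop off."""
    return float(model.predict_proba([[hands_played, net_winnings, minutes_played]])[0, 1])

# Real-time step: evaluate a single table event as it arrives.
p_drop_off = score_event(hands_played=85, net_winnings=-60.0, minutes_played=70)
if p_drop_off > 0.7:
    print(f"Drop-off risk {p_drop_off:.2f}: consider a personalised, responsible incentive")
```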

2. Open up a productive sandbox for your data scientists

Big data is a major trend for gaming companies, and it’s vital if they want the reward-based personalisation model to work. But as they collect more data, there’s the evergreen question of “where do we store that data?” and “what do we do with it?”

For leading gaming companies – and the leaders in many other industries – that means moving on from data warehouses to data lakes. It’s the perfect way to bring all your data points together into one place. But it’s vital you don’t get carried away.

The most important thing to ensure is that your data lake doesn’t just become a sandbox for individual developers to play in, with copies of data being added to it left, right and centre because everyone’s working in silos. You need to ensure it’s productive.

To make your data a true asset to the whole organisation, it’s important to ensure all your developers and data scientists have controlled access to the data lake, and a detailed catalogue that lets everyone know which datasets are in the lake, where they are, and what they can be used for.

(I talked in more detail about data lakes and DataOps in my last blog.)
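
A catalogue doesn’t have to be elaborate to be useful. The sketch below shows one possible shape for a lightweight catalogue entry in Python; the fields are an assumption about the minimum users need to answer “what data do we have, where is it, and can I use it?”.

```python
# Minimal sketch of a lightweight catalogue entry for datasets in the lake.
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    name: str                 # business-friendly dataset name
    lake_path: str            # where the data lives in the lake
    owner: str                # who is accountable for the data
    description: str          # what the dataset contains and is suitable for
    pii: bool = False         # flags data that needs extra access controls
    tags: list = field(default_factory=list)

catalogue = [
    CatalogueEntry(
        name="player_sessions",
        lake_path="raw/gaming/player_sessions/",
        owner="data-platform-team",
        description="One row per player session; suitable for engagement modelling",
        pii=True,
        tags=["gaming", "sessions"],
    ),
]

# Anyone in the business can answer "what do we have, and where is it?"
for entry in catalogue:
    print(f"{entry.name}: {entry.lake_path} (owner: {entry.owner}, PII: {entry.pii})")
```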

3. Make the most of your infrastructure – and know when to branch out

Gaming companies aren’t just making big-bang investments in large-scale changes to their infrastructures and operations. Many are making small, piecemeal adjustments that improve their capabilities over time.

Optimise the infrastructure you already have to ensure you’re getting the most from investments you’ve made. Pay attention to peaks and troughs in usage, and run resource-heavy processes (such as batch scoring) during quiet periods. You can also focus on optimising and automating basic data processes to help free up your data scientists to focus on developing new models, products and personalised experiences.
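
As a trivial illustration of the “quiet periods” point, a scheduled job can simply refuse to start heavy work outside an agreed off-peak window. The window times and the placeholder batch-scoring function below are purely illustrative.

```python
# Minimal sketch: only kick off a resource-heavy job inside an agreed off-peak window.
from datetime import datetime, time
from typing import Optional

OFF_PEAK_START = time(1, 0)   # 01:00
OFF_PEAK_END = time(5, 0)     # 05:00

def in_off_peak_window(now: Optional[datetime] = None) -> bool:
    """Return True when we are inside the agreed quiet period."""
    current = (now or datetime.now()).time()
    return OFF_PEAK_START <= current < OFF_PEAK_END

def run_batch_scoring() -> None:
    print("Running batch scoring while user demand is low...")  # placeholder for the real job

if in_off_peak_window():
    run_batch_scoring()
else:
    print("Outside the off-peak window; deferring the heavy workload")
```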

Once you’ve taken your existing infrastructure as far as it can go, you might then want to consider migrating specific workloads to the cloud to make them easier to manage. There’s also the option to build out a complementary or new solution that’s tailored to your organisation’s individual data demands.

If you’d like to learn about the Adatis approach, and how our specialist Data Analytics Platform and AI Managed Service can help you get ahead in your own industry, just drop me an email at dan.perrin@adatis.co.uk.

DataOps: Analytics Revolution or Operational Burden?

2019 will bring a shift towards DataOps, with new tools empowering business users to put data to work in exciting new ways. The key will be to get the right balance of access, control and cost.

In 2018, DataOps made its debut appearance in Gartner’s data management Hype Cycle as an ‘innovation trigger’. In its December Innovation Insight paper, Gartner went on to describe DataOps as a way to “drive organizational change and predictability for using data without massive investment”.

If you trust the analysts, it looks like 2019 is going to be the year for DataOps.

I recently explored the rise of DataOps in my Big Data London recap, where it also cropped up as a key trend for 2019. But what does it mean in practical terms for organisations keen to implement it?

Data has more value when everyone can access it

Firstly, it’s useful to clarify that, while the term ‘DataOps’ suggests an extension of DevOps, it’s really its own separate evolution. Think of it more as a ‘spiritual successor’ – organisations are applying the principles of DevOps, such as agile techniques and continuous delivery, to drive transformation in the way they store, process and use their data.

As the technologies and techniques of DevOps become more firmly established in businesses – and familiar to their leaders – organisations are starting to look for ways to extend their value. A recent survey found that 73% of IT and data professionals are planning to invest in DataOps in 2019 through new resources and hires.

And as awareness of data’s importance grows, so does awareness of its versatility. Data has value far beyond supporting planning and reporting in individual business units, and many businesses are starting to understand the strategic potential of making it available organisation-wide.

When business users have access to all of the organisation’s data, and the right tools to work with it, they can build a better understanding of what’s going on now and what the organisation could be doing next.

Data lakes make data widely accessible

That might sound like a nightmare for IT. But in our experience, far from adding to IT’s burden, DataOps is a key opportunity to take the pressure off specialist resources.

In a DataOps world, business users and analysts are empowered with technologies like Power BI, Tableau and Python to experiment with structured and unstructured data, and uncover new insights that can drive the business forwards.

When data is made available to the business by means of a data lake and a catalogue of what the lake contains, users can prototype and put it to work: interrogating the data, visualising it, or developing models and running algorithms. They no longer have to rely on IT to do this work for them.
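
In practice that self-service work can be as simple as the sketch below: an analyst reading a catalogued dataset straight from the lake with pandas and producing a quick summary, with no ticket to IT. The dataset path and column names are hypothetical.

```python
import pandas as pd

# Hypothetical dataset listed in the catalogue; the path assumes the lake is mounted
# or synced locally - adjust for your own storage setup.
sessions = pd.read_parquet("lake/raw/gaming/player_sessions/")

# Quick self-service insight: average session length per acquisition channel.
summary = (
    sessions.groupby("acquisition_channel")["session_minutes"]
    .agg(["count", "mean"])
    .sort_values("mean", ascending=False)
)
print(summary.head(10))
```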

For example, one of our customers, a UK water utility, had no way for team members to be self-supporting in their use of data for reporting and data discovery. We’ve worked with them to build a data lake-based platform with reusable templates and patterns that enable faster delivery timelines. The increased pace has enabled business analysts to provide insights more rapidly, in turn improving decision-making across the organisation.

Data lakes won’t replace IT-intensive tools like data warehouses – many critical processes rely on data being stored and reported on from tightly controlled sources. But for widespread access to data, data lakes really come into their own.

Governance, cost control and forward planning will be critical

A DataOps approach can support agility and innovation – but it’s vital that businesses don’t let things get out of hand in their quest to build a more sandbox-esque data environment.

Strong governance is still essential: business users will still need to comply with company data policy and external regulations like GDPR, for example. There’s no consensus yet on where this responsibility lies – although likely to be with the Chief Data Officer, it’s up to the individual business to decide exactly what its governance framework and policies are, and who’s responsible for them.

Cost control is another essential. Like DevOps, DataOps has become synonymous with cloud, but compute requirements can quickly get expensive without careful monitoring and management. With a data lake, multiple people across the organisation may be working with the same data at the same time – racking up compute costs that can go unnoticed until it’s too late.

Finally, a successful DataOps approach means always keeping an eye to the future. Business needs and current technologies evolve continuously, so the DataOps environment must evolve too – otherwise you risk falling behind and having to make major changes again in a few years.

A managed service can help

Getting the full value from DataOps means finding the right approach for your business. And that doesn’t always mean doing everything in-house. With a managed service from an experienced third party, you can rely on external experts to help with things like building and maintaining your data lake, and monitoring and optimising your cloud compute costs – while you focus on building your DataOps specialisms internally.

If you’d like more guidance on what DataOps could look like for your organisation, Adatis offers free half-day workshops to help you focus on your business needs and which models and solutions suit you best. We are also growing our specialist Managed Services team. If you’re interested in either, don’t hesitate to email me at dan.perrin@adatis.co.uk.

3 analytics predictions for 2019, inspired by Big Data London

The buzz at Big Data London hinted at changes to come in 2019 – from new data-driven experiences to the rise of DataOps and the evolution of big data beyond cloud and open source.

If you work with data in the UK, chances are you were at November’s Big Data London event. This was the third year the show has been held, and it was by far the biggest and most wide-ranging yet.

I was there to find out what kind of challenges companies are facing in getting big data and analytics programmes embedded in the organisation, and to get a feel for how things might evolve next year.

The conversations I had, and the presentations I saw, suggest three key developments to come in 2019.

#1 Data will drive completely new customer experiences

Lots of organisations are still getting started with big data and analytics, and the business value of their fledgling initiatives may not yet be proven. Big Data London had plenty of content to inspire them to press on with building platforms that will give their business a true competitive advantage.

One of the new tracks at the event focused on using data to create unique customer experiences. One presentation stood out for me: New Nudges by Alastair Cole, Chief Innovation Officer at Andrews Aldridge.

Alastair showed how customer data coupled with machine learning can lead to the creation of “big ideas, crafted for individuals” – enabling truly personalised products (as opposed to recommendations of existing products) to be created and offered to consumers on the fly.

Few organisations are in a position to be able to do this successfully today, but many are heading in that direction. Our own work with companies like Rank and the Restaurant Group is focused on building platforms that can ingest vast amounts of customer data in real time, to allow machine learning algorithms to be applied to up-to-date data.

That kind of platform takes time, effort and expertise to set up and maintain, but Alastair Cole’s presentation provided a glimpse of the kind of unique value it can deliver.

#2 Advanced organisations will move to a “DataOps” model

Another new track for 2018 at Big Data London was “DataOps”. Just as DevOps has made software development more agile, DataOps promises to do the same in 2019 for data-driven activities.

Anyone involved in analytics today will recognise the frustration of trying to meet business demand for instant intelligence, when the processes for gathering, pooling, cleaning and interrogating data come from a previous era and are tortuously slow.

Organisations that are serious about always-on insight, and about applying AI and machine learning to data in real-time, need to completely rethink the way data is handled and delivered in the organisation. DataOps, with its emphasis on responsive, agile processes, seems to hold the answer.

It’s certainly something we’re seeing our customers ask for. We presented a case study at Big Data London about our work with the Rank Group to speed up the process of pooling data, applying and updating machine learning algorithms, and delivering the insights back to the business.

Previously, it took around seven months for Rank to make new data feeds available to gain business insights. By building and managing an analytics platform in the cloud, and re-organising IT operations around the delivery of the resulting insights, we’ve been able to bring that process down to hours and minutes.

As AI and machine learning initiatives increasingly emerge from their R&D ivory tower and start to be embedded in the business, the reorganisation of operations to be more data-centric feels like something we’ll see a lot more of next year.

#3 Big data will evolve beyond cloud and open source

For a long time, big data has felt synonymous with the open source movement. If your organisation has a strategic commitment to Microsoft, you might have felt that events like Big Data London were not for you, and that you were missing opportunities to harness the full value of unstructured data.

But that’s changing, and fast. As a Microsoft Gold Partner, Adatis was privileged to present to the Big Data London audience some new capabilities coming to SQL Server next year.

With purely relational databases increasingly preventing companies from unlocking the full value of their data, Microsoft has made radical architecture changes to SQL Server 2019 to address the challenge. Its new big data clusters capability will enable teams to move quicker, work with a wider array of data, handle massive datasets and augment their code with open-source libraries and projects.

That’s important for another reason, too: it will make it easier to run big data projects on-premises. With many organisations preferring to keep their data in-house, for regulatory, policy or (perhaps surprisingly) cost control reasons, the ability to pool structured and unstructured data for real-time analysis will be game-changing.

The challenge for data scientists: staying focused on delivering business impact

If Big Data London is any indication – and as one of the UK’s biggest big data conferences, it should be – then 2019 promises to be an exciting one for data-driven organisations.

The challenge for data scientists, though, will be to stay focused on delivering business impact, and not get bogged down in the nuts and bolts of operating and evolving the analytics infrastructure.

Database technologies, analytics and visualisation tools and cloud platforms all evolve fast, and keeping up with the underlying tech may not be the best use of time for a skilled data science team.

10 questions to ask now – and a half-day to focus on the way forward

If you’re currently mulling the best way to operationalise and evolve a big data, analytics or AI program, it makes sense to consider these ten questions before making a decision.

You may also welcome an opportunity to get your data science team together to think about the best way forward. Adatis would be pleased to organise a free half-day workshop to explore your objectives and current model, in order to uncover the best solution. For more information about what that would entail, email me at dan.perrin@adatis.co.uk.

A checklist of ten areas to consider if you are thinking about establishing a Data Analytics Platform or AI solution.

In recent years, building a future-proof data and analytics solution and service has been rapidly rising up the agenda of most CxOs, with good reason. Based on our experience at Adatis, here are 10 questions that I think are worth considering before committing to how you operate and evolve your own Data Analytics Platform.

1. Can we attract the talent to do this in house?

Data and analytics skills are in very high demand in the marketplace, and individuals with the latest cloud experience certainly command a premium. Can you offer a role with the interest and reward to attract the best talent, and do you have the time to invest in finding them?

2. How do we ensure the on-going efficiency of the platform?

With cloud technology there may be direct savings to be made by ensuring that the Data Analytics Platform is optimised and remains so. Will the team have the time and knowledge to monitor your platform, keep it efficient and minimise your consumption costs?

3. Is it straightforward for the team to cover the critical hours of processing and operation?

The business dependency on a Data Analytics platform is increasing, and the likelihood is that data is no longer just arriving in a batch in the early hours of the morning. Can your team provide the necessary hours of cover to ensure that, as a minimum, everything is processed by the start of the business day and the platform can be trusted?

4. Can we build the scale of team required to cover all the essential skills?

The range of technologies that form a modern Data Analytics Platform can be bewildering. Will you have the scope to build a team with the breadth of skills required to operate the platform, from cloud infrastructure to data science model retraining, and to collaborate effectively with your end users?

5. Do we have the budget to invest in the training to ensure the team are effective?

To operate the data analytics platform and provide support down to third line, the team will need depth of knowledge. Can you give the team exposure to learning and development opportunities so that they become experts in the application of the technology and the operation of the service?

6. Can we retain the knowledge of the platform efficiently or is this a potential risk?

Data Analytics Platforms are generally complicated and evolve over time. Will there be wasted effort, and potentially an impact on the service, in ensuring that knowledge is shared and maintained, particularly as individuals leave and join the team?

7. Would we benefit from having access to experts?

The technology, and specifically cloud platforms, are continually evolving. Are you able to keep track of the change, or will you have access to experts who can give you regular updates and recommend where new capabilities might be of value?

8. How will we continually improve the solution and service?

The team will likely be faced with a continual list of improvements required to ensure the solution evolves with the changing business. Will you have the processes and safeguarded time to respond and implement the required on-going changes, and will you be able to provide an impartial perspective to evaluate the service and identify ways in which it can be improved?

9. Are we confident on what the operational costs will be, and can we control these?

There may be several factors that add an unknown element to your operational costs, e.g. call-out allowances, cloud consumption, recruitment fees, training costs, and ad-hoc advice and guidance. Would it be beneficial to be able to fix your operational costs, potentially for several years?

10. And so, bearing in mind all of the above, is in-house, or even your preferred outsource partner, likely to provide the right service for your organisation?

Every organisation and every platform is different, and one size does not fit all. So if you’d like some advice on answering these questions for your organisation, please do message me and the Adatis team will be very happy to discuss.