A key aspect of delivering data analytics and visualization is the performance of your solution; performance can easily make or break a project, yet too often we focus on only a few narrow aspects. In this article, I want to share my three main performance considerations when planning and delivering business intelligence and data analytics solutions.

Note: this is not an article focused solely on Tableau; my experiences go beyond Tableau, but I hope you will find this an interesting read.

1. Time-to-Deliver

Times change and technology evolves at a frightening pace, and with that, so do our challenges. Client expectations around Time-to-Deliver have risen rapidly over the past decade. I come from an age where the time between a consumer requesting data and receiving it could be weeks, months, or never, and, to add insult to injury, the request came with a time-consuming list of documents to fill in and vague processes to follow.

We live in a new age now, one of short tweets and micro-articles, where self-service is no longer a hard-to-achieve luxury but a must-have for any organisation that strives to become data-driven. Gone are the days when each request came with a 10-step process from data sourcing to report development; here are the days when we can store near-infinite amounts of data, pre-process and optimize it, and use tools that put data a single click away. All provided we set up our data landscape appropriately, of course.

For your current project, ask yourself:

  • What is the time between your end-users requesting information and receiving an answer? This is your Time-to-Deliver.
  • How many times has a long Time-to-Deliver been a major factor in whether customers accept your solution?

As a consultant, a major factor here is understanding the data landscape and data capabilities (people, processes, and tools). So let us break this down by capability.

  • Tools – Do you have the right tools in place to solve your business challenges? No technology is a silver bullet, but are you maximizing the performance of the tools you already have? For example, if you are using Hadoop, are you using an optimization layer? If you are using a relational database, have you tuned your schema (see the sketch after this list)? Are you maximizing your existing investment; for example, if you are using SQL Server, have you considered SSAS for performance?
  • Processes – What are the processes in place for delivering information to those who have requested it? I normally see two ends of the spectrum. On one side, data consumers have to follow a long process for documenting their requirements before anything is sent to a delivery team, which creates a serious lag. On the other side, in the world of Tableau, I have seen curated data sources developed so that consumers can service their own requirements. It all comes down to where you are on the data democratization journey.
  • People – Something that is thankfully becoming a focal point in business intelligence, data analytics, and data democratization is the drive to raise awareness around data. This is something you can influence, so understanding the skills in your organisation, the training requirements, and how you can improve awareness is essential to improving your Time-to-Deliver. Too often, end-users blame the tools instead of learning them, so training is important for adoption.
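
To make the schema-tuning point concrete, here is a minimal sketch of how I sanity-check whether an index helps a common query pattern. It uses Python's built-in sqlite3 module purely as a stand-in for whatever relational engine you actually run; the sales table, its columns, and the data volumes are hypothetical illustrations, not a benchmark.

    import sqlite3
    import time

    # An in-memory database with a hypothetical sales table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (region, amount) VALUES (?, ?)",
        [(f"region_{i % 50}", float(i)) for i in range(200_000)],
    )

    def timed_query() -> float:
        """Time a typical aggregation that a dashboard might issue."""
        start = time.perf_counter()
        conn.execute("SELECT SUM(amount) FROM sales WHERE region = 'region_7'").fetchone()
        return time.perf_counter() - start

    before = timed_query()  # answered by a full table scan
    conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
    after = timed_query()   # answered via the index
    print(f"before index: {before:.4f}s, after index: {after:.4f}s")

The same discipline applies whichever engine you use: measure a representative query, apply one tuning change, and measure again.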

I have spent the majority of my career working across the data stack, from acquisition to storage and from data sources to end-users, and have always focused on improving my skills in each of these areas. We live in a world where data volumes are increasing and expectations for Time-to-Deliver are shortening, so it is important to take this into consideration.

2. Time-to-Insight

This is where we look at the design and, more importantly, measure the time between opening a dashboard (or report) and seeing insights. On my travels through various clients, I see far too often that outputs created for end-users are built to deliver data rather than insights, and this is something I push back on. End-users having to work through menus, clicks, and browsing before they see their insights is really not a good thing.

When creating outputs for end-users, it is not just about presenting KPIs; ask them why they will open the dashboard, what questions they have, and what they expect to see. This is part of requirements capture, and far too often it is left out in the world of fast, self-service data analytics, yet it is so important.

For your current project, ask yourself:

  • Do you know all the questions that your outputs are designed to serve? Have these been documented and thoroughly reviewed? A lot of the projects I have seen involve specifying the facts (measures) and dimensions, but not the questions. For example, you have a sales-over-time chart showing the last 5 years; this is great and serves a purpose, but what if my question is about year-on-year growth? It is important to understand the question and not just the KPIs.
  • How many actions does the data consumer need to perform to get an answer to their question? This is something that has made me pull out my hair more than once. Picture this: I am reviewing a dashboard and I ask my developer a simple question. The developer smiles and responds, “Of course we can get this information. First, you open the menu panel here, then you set the following parameters (there were five to set), then you click on this part of the dashboard, which filters the bottom visuals, then we click on the visual on the left to filter further, and there you go, you now have your answer.” I sighed and responded that I had counted roughly 10 steps to find the answer; do you think the data consumer will remember these steps, or just give up? Think about this.
  • How long does it take to get to the insight that you want? I use quite a simple formula: the number of steps multiplied by the time per step (sketched below). Simple, right? The main thing here is to give your data consumers minimal frustration and no reason to doubt what you have produced.
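
To show how blunt and yet how useful that formula is, here is a tiny sketch. The step counts and the 15 seconds per step are hypothetical figures drawn from the anecdote above; in practice, I estimate them by watching a real user work through the dashboard.

    def time_to_insight(steps: int, seconds_per_step: float) -> float:
        """Rough Time-to-Insight in seconds: number of steps multiplied by time per step."""
        return steps * seconds_per_step

    # The 10-step dashboard from the anecdote, at roughly 15 seconds per step:
    print(time_to_insight(10, 15))  # 150 seconds before the consumer sees an answer
    print(time_to_insight(2, 15))   # 30 seconds if we redesign it down to two steps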

Time is a precious commodity. You obviously want the information in front of you in a timely manner (Time-to-Deliver, a data engineering task), but you also want to minimize the time it takes to extract insights from it (Time-to-Insight, a visual design task); working on both will be key to your project's success.

3. Time-to-Action

The Time-to-Action is something that is far too often missing, and something I have become increasingly focused on over my career. Time-to-Action, for me, is the reason why business intelligence and data analytics projects exist. There is no point in having information if there is no action attached to it, or if the attached action takes too long to perform.

Let me give you a real-world example: imagine a dashboard targeted at your sales workforce, one that gives a list of tasks to be performed each morning. This sounds pretty simple: an automated task list leveraging a calendar and some insights to make sure that nothing slips through the net. The dashboard has insights and actions, and the list is quite comprehensive, so it is pretty useful.

Now imagine someone views the dashboard and sees a task to email a client about renewing their subscription. They click on the email address and it opens Outlook. Great, but now they have to go back and forth between the dashboard and the email to double-check that they copied the numbers across correctly; being cautious, they take a total of 15 minutes to make sure the email is right before sending. I am sure you have seen instances like this in your own operations.

Now, we know that the consumer needs to send an email, and we know what content should go into it, so why not open the link to Outlook with all the required information already embedded (sketched below)? If we did that, the user would no longer be copying information across, but simply double-checking their message before sending. This is a simple but hopefully illustrative example of how we can improve the Time-to-Action.
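
As a sketch of what that could look like, the snippet below builds a mailto: link with the subject and body pre-filled, which most dashboarding tools can embed behind a URL action. The recipient, client name, and renewal amount are hypothetical placeholders for fields already sitting on your dashboard.

    from urllib.parse import quote

    def renewal_mailto(recipient: str, client: str, amount: float) -> str:
        """Build a mailto: link with the subject and body already filled in."""
        subject = quote(f"Subscription renewal for {client}")
        body = quote(
            f"Hi {client},\n\n"
            f"Your subscription is due for renewal at {amount:,.2f} per year.\n"
            "Please let me know if you would like to proceed.\n"
        )
        return f"mailto:{recipient}?subject={subject}&body={body}"

    # Embed this URL behind the email address on the dashboard:
    print(renewal_mailto("client@example.com", "Acme Ltd", 12500.0))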

On your current project, ask yourself:

  • Do you know the actions for each of your dashboards? This is about looking beyond the delivery of information and towards supporting actions, and it is something we need to focus on more and more.
  • Can you support the various actions? If you take our previous email example, can you do more to support the action? This can save plenty of time or, better yet, give more time back to those who are making decisions.

When producing dashboards or reports, always think about the action and the Time-to-Action. The more you can reduce it, the better off you will be and the happier your clients will be.

Summary

I hope you have enjoyed reading this article as much as I have enjoyed writing it; I will be sharing more of my thoughts with you all, but this one is of particular interest to me.

Always remember to think about the three Ts (Time-to-Deliver, Time-to-Insight, and Time-to-Action) and work diligently to reduce them as much as you can; doing so will benefit everyone involved.

Lastly, please let me know what you think, share your thoughts below, or reach out to me on my various social media platforms. I would love to discuss.

2 COMMENTS

  1. Great article! Even though you're clear that this is not an article focused solely on Tableau, I'm still surprised about the recommendation of SSAS. If we use Tableau with SSAS as a source (well, the website name implies that), you should be aware of the loss of functionality and even performance, depending on the cubes' structure and size. My recommendation is always to try to avoid SSAS with Tableau; if that is not possible, then keep the complexity to a minimum in both the cubes and the dashboards.

    • Hi Janne, thank you for your comment.

      The reason I mentioned SSAS (it was just an example) is that it is one of the best ways to leverage the power of SQL Server. While it is not perfect for all occasions, if you are using Excel, Power BI, or other tools, it should be considered a serious option for optimisation.

      As mentioned, this is not just an article about Tableau, and while Tableau suffers a loss of functionality when using SSAS, there are many use cases for SSAS; for example, if you want to use a live connection over a large volume of data sitting on SQL Server, SSAS is probably the way to go. I have used SSAS with Tableau in many instances (normally the client's choice), and while you are right that it becomes a little more complex because you cannot use certain formulas, you can normally overcome that by building them directly into the OLAP cube; if you optimise your cube for Tableau, it can work well.
