The value of little data
This week I attended a Tableau showcase. I enjoy attending these events, not for information on the product itself, but for a read on which way the industry is moving. This event featured several Tableau customers, namely Alberta Health Services (AHS), Groundswell Group, and Clark Builders. This is the sort of dog-and-pony show where the customers provide testimonials for the product. While there was some of that, it wasn't all the boilerplate drivel one would expect from this type of event.
These clients emphasized process integration, the iterative nature of BI, and the journey of data discovery. The conversation in the industry is changing: it's less focused on building large infrastructure, and more on user enablement and engagement. These organizations have realized the value of placing the enabler in the business itself, rather than providing a service through IS/IT. The discussion reframes what analytics means to an organization.
Making a profit? That's just a result. Reaching your budget on time? That's a result. The journey is user engagement. The journey is building this information into the business process. The journey is giving the organization the ability to adjust, make decisions, and change behaviour.
Despite this event being a vendor showcase, the customers were selling the philosophy more than the product. They are selling continuous improvement, process refinement, and change management. The tool is simply the enabler. It brings to mind a comment made to me recently: SAP HANA has been a wash in industry. Why? Because it doesn't let the business iterate quickly with the information at its disposal. It's just a faster database.

While there is still a significant market for large computational models, that's not what 95% of organizations are after. "Big data" is a misnomer; there is seldom anything said about little data, yet most organizations derive most of their insight from it. Given the growth in computing power, what used to be big data is now comparatively small. An analyst with a decent laptop can crunch significant volumes of data with Excel (Power BI, Power Pivot, etc.) or a tool like Tableau. Think of it as the computational power in analytics being commoditized. Insight gained through quick iterations and close ties with the business, however, cannot be commoditized. This is why tools like Tableau and Power BI will gain ground: they are cost effective and do not require infrastructure to deliver value to the organization.
With this in mind, the province of Alberta has put out an economic dashboard. The aforementioned philosophies apply to the public sector as well.
I'm not sure what problems this dashboard was meant to solve. Presumably, the intent is to quantify the "Alberta Advantage". These indicators look like the result, and as an Albertan, they certainly don't tell me anything about the journey. There's a lot of information here, but nothing that will influence the average Albertan. Then again, maybe that wasn't the goal. What's the government's value proposition to its electorate?
So think about this: based on the Vote Compass from the last groundbreaking provincial election, most Albertans believe we are too dependent on natural resources.
I can get a trend of GDP over the past ten years…but not by industry.
Or I can get a year-over-year view of what GDP by industry is doing…
But that still doesn't help the average citizen easily understand whether we're getting better or worse as a province. The same rules apply in the public sphere as in the private. How was the electorate engaged? How can the province better communicate what it's doing well?
This is a very real application of little data in the public space, and it is perfectly suited to a tool like Tableau Public. The time is ripe for disruption.