Poor BI; still struggling with broader adoption – as outlined by Ron in the post Four BS BI Trends (And One Good One). So Gartner identifies BI as the “number one technology issue for 2007,” then immediately pulls out this old chestnut as BI Trend #1: There’s so much data, but too little insight.
Then I get this comment by Ron Patiro asking: Besides simply not being actionable, what are some of the common pitfalls and tangles of metrics that analysts get themselves into in the pursuit of engagement?
These two ideas are closely related. The “common pitfalls and tangles of metrics” are often the reason people get a “so much data, but too little insight” experience. Let’s explore these issues a bit.
The primary reason you get a “so much data, but too little insight” situation – if you have an analyst to work with the data – is indeed the actionable analysis problem, as Ron P. points out. But, there are at least 3 versions of the actionable analysis problem, one obvious and two not so obvious:
- Producing analysis that isn’t actionable at all
- Producing analysis that is valid but too complex to be actionable, and
- Failing to act correctly on a valid and easy to understand analysis
And often, I find the Root Cause of these three problems (to answer Ron P’s question) to be faulty segmentation logic. This condition, in turn, is often born of a situation many web analysts are familiar with by now: No Clear Objective. But let’s leave the segmentation discussion for later and examine each of the three cases above.
One cause of the “too much data, no insight” experience is producing analysis that isn’t actionable at all; it’s literally worthless and cannot be acted upon. This is the most common vision of the actionable analysis problem – but probably not the one causing the majority of the negative outcomes. Analysis can be “actionable” from the analyst’s perspective, but not the business perspective. And if no actual business action takes place, no real insight is gained.
In my experience, people spend an incredible amount of time analyzing things that will never create impact. Even if the analysis produces something that looks actionable, often the execution is impractical or financially irrelevant, and so it is not acted upon. Just because you can “find a pattern” does not mean the business can do anything productive with that pattern. Randomly “mining for gold” is one of the biggest time wasters around, and it’s why people are often dissatisfied with the results they get from black box data mining projects. You have to start with an actual business problem of some kind – preferably one that, if solved, will increase sales or reduce costs – or no action will be taken. Otherwise, you have simply created more data to add to the “too much data” side of the problem.
The bottom line for this slice of the problem: The intent and result of the analysis might be actionable, but unless there is a clear business case for acting, you have just contributed to the actionable analysis problem. In other words, there is a difference between an analysis being “actionable” and having people actually act on it.
The 2nd slice of the “too much data, no insight” problem occurs when the analysis is too complex. In Marketing at least, complexity introduces error, and probably more importantly, hinders the explanation of the analysis to people who might take action and gain insight. If a Marketing person can’t understand the analysis, how are they going to formulate a campaign or program to address the problem, never mind get budget to act on the analysis? Please note I’m talking about the analysis, not solving the problem itself. Often, an elegantly simple analysis uncovers a problem that will be quite complex to solve. These are two different issues.
In fact, I would go as far as to say the more complex the problem is to be solved, the more elegantly simple the analysis needs to be. The reason is this: the most complex Marketing / Customer problems are usually cross-functional in nature, and to drive success in a cross-functional project, you need rock-simple analysis that galvanizes the team without a lot of second-guessing on the value of a successful outcome.
The bottom line for this slice of the problem: An analysis might be correct and even actionable, but too complex to be acted on. Complexity opens the analysis up to (often accurate) disbelief in the conclusion, action never takes place, so insight is lost.
The 3rd “too much data, no insight” problem is failure to translate a valid and easy to understand analysis into the correct action. Here, we are finally moving out of the analytics side of the problem (delivering actionable analysis) and into the Business side.
Why is there failure to act correctly? I’d submit to you it goes back to the Deconstruction of Marketing – most marketing folks simply don’t understand what to do with “people” as opposed to “Reach and Frequency”. In other words, they can’t conceptualize how to act successfully against the individual or behavioral segment level as opposed to the nameless, faceless demographic level.
In my opinion, this is the primary reason why demographics are so overused in customer analysis, especially online – the marketing folks simply can’t get out of that box; it’s where the “actionability” starts for them. The problem with this thought process, as has been pointed out, is that demographics often have little to do with behavior. Behavior predicts behavior; demographics are mostly coincidental. Yet the analyst, looking to produce a successful project, will often allow themselves to be dragged into endless demographic segmentation that is primarily a waste of time (unless you are a media site and sell demos) and leads to false conclusions, which in turn lead to failed or inconsistent implementation.
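To make the “behavior predicts behavior” point concrete, here is a minimal, hypothetical sketch of behavioral segmentation – recency and frequency scoring over a purchase log. All names, dates, and thresholds are invented for illustration; the point is simply that the segments are built from what customers did, not who they are:

```python
from datetime import date

# Hypothetical purchase log: (customer_id, purchase_date) pairs.
purchases = [
    ("a", date(2024, 1, 5)), ("a", date(2024, 3, 1)),
    ("b", date(2023, 6, 10)),
    ("c", date(2024, 2, 20)), ("c", date(2024, 2, 28)), ("c", date(2024, 3, 3)),
]
today = date(2024, 3, 10)

# Aggregate recency (days since most recent purchase) and frequency per customer.
stats = {}
for cid, d in purchases:
    recency, freq = stats.get(cid, (None, 0))
    days = (today - d).days
    stats[cid] = (days if recency is None else min(recency, days), freq + 1)

# Segment on behavior, not demographics: recent repeat buyers stand apart
# from recent one-time buyers and lapsed customers. Thresholds are arbitrary.
def segment(recency, frequency):
    if recency <= 30 and frequency >= 2:
        return "active repeat"
    if recency <= 90:
        return "recent one-time"
    return "lapsed"

segments = {cid: segment(r, f) for cid, (r, f) in stats.items()}
print(segments)  # e.g. {'a': 'active repeat', 'b': 'lapsed', 'c': 'active repeat'}
```

A segmentation like this is directly actionable – a “lapsed” customer gets a reactivation offer, an “active repeat” customer gets loyalty treatment – which is exactly the kind of clear mapping from segment to campaign that a demographic label usually fails to provide.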
The bottom line for this slice of the problem: the analysis identified a problem or opportunity, but in the end, the execution against the analysis was flawed and ultimately delivered poor or no real insight. By the way, I think this third form of failure to deliver insight is the most common – much more common than most people think. Why? It’s the hidden one, the one that’s not so obvious and much easier to push under the table.
So there you have it. Three versions of the “actionable analysis” problem that lead directly to the “so much data, but too little insight” issue. I think #3 is probably the most prevalent; a lot of analysis “fails” not because of poor analysis, but poor execution against the analysis.
What do you think? Have you delivered a clearly actionable analysis, one that is capable of real business impact, only to have the execution against the analysis botched?
Perhaps more importantly, were you able to do anything about the botched execution? Were you able to turn it around? How did you make that happen?
Or, is execution not really your problem – if Marketing (or whoever) screws it up, then they screw it up?
3 thoughts on “Data, Analysis, Insight”
I wouldn’t be so quick to blame the “analysis” for not being actionable. In my experience, it’s the analyst who doesn’t connect the dots to MAKE IT actionable.
Why does this happen (so often)? I don’t think it has anything to do w/ the capabilities of the analyst. I think it’s an all-too-common bias on the part of analysts to focus on, or search for, that “killer insight” — that “AHA!” moment when the stars align and the heavens part to shine a light on the great mysteries of the world.
Lest anyone think I’m pointing fingers at others, no way — been guilty of this way too many times myself.
Thanks for the follow-through! I particularly like how you framed the common hang-up of “faulty segmentation logic” as being aided by the disconnect in marketing information (Deconstruction of Marketing). I see a lot of truth in that. I also couldn’t agree with you more that demographics do NOT indicate online behavior.
Ron S, at the heart of your issue somewhere is the question of whose responsibility it is to make analysis actionable. Some analysts feel their role continues on into execution, others feel they’re done at the analysis – which is a shame, I think. Fortunately, this is much less common in web analytics than in BI because the analyst is often the executioner – er, person / part of the team that takes action.
In BI, often what happens is the analyst gives the Marketer “exactly what they asked for”, which often doesn’t make any sense and is headed for failure to begin with. Whether the analyst knows this and actually acts on it is a matter of culture – many have faced the “when I want your opinion I’ll ask for it” response when challenging ideas and are reluctant to comment. Others simply don’t know enough about the business to form an opinion on actionability.
Ron P, I wonder if anybody has ever studied the real accuracy of online demographic information? There’s lots of anecdotal stuff around, but I wonder…if I took a segment of visitors, say Male 18 – 24, I wonder how many of them would actually be Males 18 – 24? And what would that say about using demos for online segmentation?