Analyze, Not Justify

Does this issue affect the Web Analytics Maturity Model?

A conference call with a Potential Client last week jogged my memory on a couple of events that happened during the flurry of Web Analytics conferences this Spring.  Here’s a portion of the call…

PC: “We’ve tried proving the profitability of our Marketing efforts and can’t seem to get the numbers working correctly.  So Jim, what we’d like you to do is take all this data we have, and justify the Marketing decisions we’ve made by proving out the ROI.”

Jim: “I’m sorry, did you say justify?  To me, justify means “find a way to prove it works”.  Is that what you are asking me to do?  Wouldn’t it be more beneficial to analyze the results, and then optimize your Marketing based on these results?”

PC: “Jim, around here we’re pretty clear our Marketing works, and Management knows this.  But Finance is asking for some backup, some numbers to justify the spend, not to analyze it.  We don’t need analysis, we need your ‘expert credibility’ to help us out with this.”

Jim: “I see,” thinking this is not a job I’m going to enjoy.  It’s the old ‘buy an outside expert’ routine, which I detest.

PC: “Jim, the team is united behind this mission, are you on board?”

Jim: “Well, perhaps I could be on board, as long as what you want is an analysis, which may also justify the decisions you have made.  But it might not, so I just want to be clear on what…”

PC: “You know what, Jim?  I don’t feel we’re going to have a fit here; I’m getting you’re not a team player.  Thanks for your time.”  CLICK

Sigh.  I’m actually grateful they hung up; I really dislike explaining to people why I won’t work with them.

And that’s when I remembered that one of the most interesting moments for me during the WA conference season happened at webtrends Engage.

I was on the “Socialization of Data” panel with a great crew composed of @jacqueswarren, @anilbatra, @johnlovett, and @bgassman.  We were talking about what web analytics might look like organizationally in the future.  Specifically, we were discussing the “Center of Excellence” concept, where all the senior analysts report to the same person, and this person typically reports directly to the C-Level.

There are many reasons this idea is a good one, but the one I often stress is relieving the pressure, explicit or implicit, on an analyst to produce a certain result from their work.  In other words, to “justify” a program rather than analyze it and get at the truth.

I said something like, “You really don’t want analysts reporting organizationally to the group they are responsible for analyzing.  This set-up tends to create a lot of pressure on the analyst to prove a program is working by torturing the data to get a desired result.”

And about half the heads in this good-sized auditorium bobbed “Yes”.

These are the people who have been asked to change a date range, to modify a filter, to exclude a segment.  To justify a program by torturing the data into saying what someone wants it to say.

That’s sad.  Really sad.  In fact, it’s downright poisonous to the long-term health of Web Analytics (or any other analytical discipline) as a profession.  It’s a rot from within, difficult to cure.

The existence of “justification” means the business is really not being run by the numbers.  What it means is the business continues to be run by “gut feel”, and the numbers are used to justify on the backend.

That’s not analytics, that’s a lie.

Depending on your experience, if you work in an environment like this, you might want to look elsewhere for a job, because eventually this game collapses.  It has to, you see; other people can and will get the correct numbers.

Especially in Finance.

And speaking of Finance, here’s the second most remarkable thing that happened to me on the Spring Tour.  I was talking with this web analyst who reported into Marketing.  One day, the CFO said to him, “You know what?  I think you should work directly for me.  What do you think of that idea?”

After an initial heart attack, the analyst said OK.  And he is so much happier, giddy in fact.  He loves his job again, really is fired up to get to the desk in the morning.  Why the change?

“The people in Finance get it, they understand what I have to say.  Nobody asks me to fudge the data in Finance, ever.  They just want the truth – good, bad, or otherwise.”

Funny how that works; I know exactly what he means.  If you are a profit-driven Marketer – and you can prove it – the CFO is truly your best friend.  Because a CFO gets Profit Math.

Then Web Analytics Maturity Models started getting a lot of attention due to the release of a new one from webtrends.  So I’m looking through the model and suddenly these two experiences from the Spring tour above pop into my mind.  I think:

You can have the best processes and procedures on the planet, but if you also have this Justification thing going on, if your analytical org chart is designed to fail, these Maturity Models are all just crap.  Literally.  A gigantic waste of time for everybody.

Worse than Scrap Learning.

Now, I’m not picking on webtrends here, because I like their model.  But what I don’t see in this model or the others is anything about properly Managing Analytical Cultures, like these org chart conflicts that drive Justification.  Not sure if this issue belongs in Governance, or Domain Expertise, or some other place.  I know the IT side has established “formulas” for Maturity Models, so maybe this org chart stuff doesn’t belong in the Maturity Model itself.

But this issue of reporting structure needs to at least be addressed in Maturity Model supporting documentation.  What good is it to have all the gears turning properly if the analysis itself is faulty, and drives continued poor decision making?  What kind of Maturity is that?

At some point in the Maturity Model, analysts should no longer report to the people whose work they analyze.  Just think about it; classic fox in the hen house kind of thing.  Analytical people have to be free of the pressure to justify, or you just get chaos (example).

Speaking of chaos, if you’re in web analytics and find out your area has been targeted for downsizing but you would like to stay with the company, here’s an idea.  Head down to Finance and ask them if they would like their own web analytics person.  You might be surprised at the response.  After all, what is it most of the people in Finance do?

That’s right, Analysis.

What do you think about this issue?  Have you ever been forced to Justify?  Are you asked to run reports with “special parameters” for some programs?  To bury or exclude certain reports?

Got any good data torture stories?

Does this organizational topic belong in the WA Maturity Model?  If not, how would you handle it, where does it belong?

 

13 thoughts on “Analyze, Not Justify”

  1. Jim,
    you are spot-on. I have lobbied this argument for years at the various companies I have worked at (to greater and lesser success).

    When you are beholden to the people whose product/work you are analyzing (even if just tacitly via paycheck), then there is a great amount of pressure on you to have their product/work succeed. And even if you’re obstinate like me, the PHB can and will tell you to change an analysis that does not agree with her/his gut. Finally, even if the PHB doesn’t do this, you are often subtly biased just by being part of the team. If you can run your analytics out of an agnostic group, you are free to let the numbers speak the truth and can have a serious impact on the business.

    Because most companies are not comfortable deploying a full-fledged analytics team with a C-Level sponsor, I have often advocated placing the analytics team within Finance as an acceptable solution that does not require a brand new branch in the org.

  2. Clint, thanks for weighing in. Perhaps now people won’t think I am completely nuts ;)

    Seriously though, with my WAA Education Committee hat on: what really disturbs me about this trend is that analysts might even think it is their JOB to Justify rather than Analyze. Particularly true if the environment is “team-oriented”.

    The effect of tacit or implicit data torture is probably underestimated, to the point where even if a manager *thinks* they are getting the truth, they might not be! I’m pretty sure this effect must have been studied in academia before (Sociology?)

    You’re a braver man than me for directly suggesting analytics report to Finance (I was trying to avoid a lynching); hope I was not too subtle in mentioning this approach as a very good fix. I wouldn’t call it a “movement” yet, but I have talked with several WA folks since the above who are very, very happy to be reporting to Finance.

    And, I think this means the particular companies involved are taking WA very seriously: WA “matters” enough to report to Finance, where the rest of the “Truth Analysis” is happening.

    Can’t get WA respect, nobody will listen? Try talking to Finance.

  3. Jim, brave? doubtful – just one of my (many) buttons. I can’t help but speak out on this topic whenever (and however) it comes up. I don’t think it’s brave when you can’t keep from doing it ;~)

    I don’t think you were too subtle, but maybe I’m overly sensitized to the subject.

    It probably has been studied; I wonder what game theory / prisoner’s dilemma, etc., it would fall under? Maybe we should just call it Stockholm Syndrome :-D?

    Jocularity aside, most orgs just aren’t prepared to be analytically oriented, so they think justification IS analysis (your call was way scary though), and it’s a huge risk.

    Maybe what’s needed in WA training is some more indoctrination in a scientific/analytic framework? I suspect that many don’t know how to perform proper research and analysis, much less frame a solid hypothesis to test rigorously.

  4. Justification IS analysis? – Well, that’s exactly what I’m afraid of, and I’m also pretty sure it’s (typically) not really the analyst’s fault. In other words, there’s not as much explicit pressure as there is “tacit collusion” going on. With so many people in WA not coming out of traditional analytics, it may be this kind of collusive behavior is just expected when you are part of the team.

    And that’s not good.

    Re: WA training, we actually do cover a lot of these analytical ethics issues, including an exercise in decision making when faced with an ethical challenge, in WAA Course 4 “Analytical Culture”.

    So we’re trying. Vox Clamantis in Deserto, perhaps, and this issue needs to be elevated somehow.

  5. Great post Jim.

    In fact, it sounds genius to report to Finance. Wearing my consultant hat, we are always trying to sell to marketing; and I also see the transition from IT to Marketing when it comes to vendors. So which vendor will make the move to market WA to Finance? Coremetrics sounds like a fit…

    Anyway, thanks.

  6. Thanks for the comment Daniel.

    Here’s something that has always perplexed me – why do so many web analysts measure success in terms of Sales?

    When I ask them, they typically say, “Because that’s what my (Marketing) boss wants to know. I understand very well that Profits are the best measure of success, but look, my boss wants me to analyze Demand, and that’s what I do”.

    Two things happening here:

    1. Tacit collusion – recognition that something is not right, but a lack of the awareness or skills (cultural, not analytical) to address the issue

    2. Since Finance would LOVE to see all the Marketing analysis based on Profit instead of Demand, this analyst has a home to go to

    When you think about a lot of what goes on in WA, the endless drilling down into smaller and smaller issues, the effort expended on micro-measurement of extremely low impact events, one has to wonder: How is it possible hardly anyone addresses the macro issue of true campaign profit?

    Campaign lift (not Response), revenue minus all costs (including product cost, for heaven’s sake), net incremental gain – one of the simplest Financial formulas around (a quick sketch of the math is at the end of this comment).  How is it possible nobody measures campaign success by profit?

    Stumbling over dollars to pick up pennies, seems to me.

    And how could this happen unless:

    1. Nobody cares, which I find difficult to believe, or

    2. There are collusive forces at work driving justification; nobody is asking any questions

    Except in Finance, where they know that ROI is based on Profit, not Demand.
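
    To make that concrete, here is a minimal sketch of the profit math in Python – every number below is invented purely for illustration, and a real analysis would plug in your own test/control counts, margins, and campaign costs:

    ```python
    # Hypothetical campaign numbers, for illustration only.
    test_group_orders      = 1200      # orders from customers who received the campaign
    control_group_orders   = 1000      # orders from a matched holdout group
    revenue_per_order      = 80.00     # average revenue per order
    product_cost_per_order = 55.00     # cost of goods per order
    campaign_cost          = 10000.00  # total campaign spend

    # Lift, not Response: only the orders the campaign actually caused count.
    incremental_orders = test_group_orders - control_group_orders
    incremental_margin = incremental_orders * (revenue_per_order - product_cost_per_order)

    campaign_profit = incremental_margin - campaign_cost
    roi = campaign_profit / campaign_cost

    print(f"Demand (total campaign revenue): ${test_group_orders * revenue_per_order:,.2f}")
    print(f"Incremental campaign profit:     ${campaign_profit:,.2f}")
    print(f"ROI on campaign spend:           {roi:.0%}")
    ```

    With these made-up numbers, the “Demand” view shows $96,000 of revenue while the campaign actually lost $5,000 – which is exactly the kind of result that never surfaces when the goal is justification rather than analysis.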

  7. Jim,

    In part this reply is an echo, and in part this reply is a re-formulation in my mind.

    To me, the whole point of analysis is to make the firm money. Money is the reason why firms exist. If an analyst was ever to lose sight of that, they fail, if not in the short run then certainly in the long run.

    Does justification make the firm money?

    I don’t think it does. The right question to ask of a data set is “what happened, why, and what can we do to make more money?”. The wrong question is “how can you make the data prove how brilliant we are?”

    Torturing is so often involved because people rarely knock it right out of the park on their first go. Why we continue to insist that all our efforts are infallible from the get-go – I don’t know.

    I applaud your standards.

  8. I don’t know why we insist people “knock it right out of the park on their first go” either, but if that exists, it’s a faulty analytical culture. In a proper culture, all the people (including management) understand that Failure is a Learning Experience. That’s why you test – so you fail small, if you fail. And everybody learns from it.

    More on this whole analytical culture issue here:

    http://blog.jimnovo.com/fear_analytics/

    As to “standards”, I don’t know if it’s that. I just hate spending time on a project that’s a “lie”, just seems like a complete waste of my time because nobody is going to Learn anything and nothing is going to get better. That’s what analysts do, in my mind – make things better. Not necessarily perfect, but honestly, financially, better for the company.

    And perhaps that’s a goal more aligned with Finance than Marketing?

  9. OK, I’ll try to keep it short. I’ve got more questions than answers.

    It’s been widely studied that when knowledge has an agenda, it becomes ideology, i.e. a view of the world that reinforces/justifies a political position, class advantages, a world view, etc. There’s also a vast body of sociological work that has studied how scientific (i.e. evidence based) knowledge is produced, bringing to light that pure objectivity can be tainted.

    Now, in a business context, it is no surprise that people, and especially “teams”, can have a similar agenda, here in both its explicit and *tacit* sense. The latter is very important, and it relates, Jim, to your “tacit or implicit data torture”. This agenda, sociology teaches us, is not always conscious. We know, however, that it can very well be conscious too, hence the term “politics” when referring to the conflicts/struggles between the various departments/units of a business.

    Now, we should ask ourselves why it is that some teams (oh boy, and so often Marketing) want to justify instead of truly analyze. Is it because of the reward system? Is it because of a desire to reinforce a weak position? If the Accounting logic/view of the world is the ultimate one, the one that *truly* describes a business, why is it that other discourses, sometimes contradictory, can co-exist in the same organization? In short, why would Finance let Marketing use soft knowledge, “gut feeling”? It almost demonstrates a patronizing attitude towards Marketing that I am sure more than one CMO has felt.

    I guess justification is at its highest in a company where measurement in general is punitive, a form of coercion. You learn that having bad news, which would report on your poor decision making, is highly risky. A learning business, as you pointed out, will tolerate mistakes as long as they are within limits and especially part of a learning/optimisation/improvement process.

    This raises the question of what a learning culture is versus a “non-learning” one. That is a whole discussion in itself, and here sociological models could bring some light.

  10. Thanks for weighing in, Jacques. I thought perhaps I could attract your attention with the Sociology reference…

    Here’s the thing. This is not some kind of dream I am having, I have worked in and for companies that have a proper analytical culture, and it’s a beautiful thing to behold.

    These cultures are not harsh, “by the numbers or else” kind of places. People don’t get shot for creating a poorly performing campaign or idea.

    There’s balance, and recognition that great ideas sometimes fail, because ultimately Marketing measurement is about Human behavior and it’s devilishly difficult to be always right about that.

    In these companies, it’s not just Marketing that’s measured, *everybody* is measured. And the interesting thing that happens in a culture like that is you begin to see the synchronicity, you see cross-silo problems where one fix improves the numbers in several operating areas at the same time.

    These companies simply spend a lot more time discussing business facts rather than arguing opinions, which actually leads to a more civil interchange between the silos. Everybody has common, *non-conflicting*, measurable goals and wants to reach them together.

    Sometimes I wonder if analytical cultures are dying off because the people involved simply don’t care. Given analytical cultures tend to have a longer-term orientation, is it possible people are simply churning jobs too frequently to give a crap about continuous improvement?

    Thoughts anyone?

  11. Well, if we go with the simple definition, a culture is a shared system of values, world views, and ethics. If people do not stay long enough to be “cultured” (in the anthropological sense), to be part of building a long-term-looking culture such as the analytical one, then yes, I guess people are after quick wins: pocket the bonus and go to the next gig. Easy to fudge numbers with such a state of mind…

  12. Then Jacques, we are left with this: the analytical culture has to be created from the top (C-Level) and then spread and incentivized (HR).

    But for a C-Level person to want to engage with this idea of creating an analytical culture, they have to know what the benefits will be (Competing on Analytics). Perhaps more importantly though, they would ask: how do I create / implement / sustain the analytical culture? And that probably requires some incentives for employee loyalty.

    Right? Is that a “framework”? :0
