Tortured Data – and Analysts

Fear and Loathing in WA

You may recall I wrote last year about the explicit or implicit pressure put on analysts to “torture the data” into an analysis with a favorable outcome.  In a piece called Analyze, Not Justify, I described how, by my count, about 50% of the analysts in a large conference room admitted to receiving this kind of pressure at one time or another.

Since then, I have been on somewhat of a personal mission to try to unearth more about this situation.  And it seems like the problem is getting worse, not better.

I have a theory about why this situation might be worsening.

Companies that were early to adopt web analytics were likely to already have a proper analytical culture.  You can’t put pressure on an analyst to torture data in a company with this kind of culture – the analyst simply will not sit still for it.  The incident will be reported to senior management, and the source of “pressure” fired.  That’s all there is to it.

However, what we could be seeing now is this: as #measure adoption expands, the tools land in more companies that lack a proper analytical culture, so incidents of pressure to torture multiply.  And not just pressure to torture, but pressure to conceal, as I heard from several web analysts recently.

One bright young analyst went “beyond the call of duty” on his analytical project.  The analyst gathered relevant data not just from the WA tool, but from Finance, Customer Service – all around the company.  The report painted a detailed picture of cost to acquire customers through various methods and campaigns, and was presented to the head of Marketing – also the analyst’s boss.

The analyst was told that under no circumstances was this report ever to be produced again.  Further, the analyst was told to destroy any “evidence” this project / report ever existed.  And finally, the analyst would now be required to send all analysis through the boss before anybody else saw it.

That’s shameful behavior for an exec.  And apparently, this kind of thing is happening more and more often.  I’ve heard plenty of “if we want your opinion, we’ll ask for it” stories, but this is the first time I’ve heard so many stories about concealing results.

Here’s a scary thought: what if the stories about web analytics not driving business value are primarily concealment stories?   What if the tool / analysts actually did provide value, which was then hidden from Senior Management?

My concern about this issue is wider than screwed up company culture and management.  What I’m more concerned about is screwed up people, analysts who may come to think this kind of behavior is normal and just part of being an analyst.

This matters because as this new generation of analysts moves to other companies and throughout the ecosystem, these pressure-to-torture situations could become “accepted” and even spread as “part of the game”.

It is never, ever OK to manipulate or hide the results of an analysis.  It’s not part of the job.  The role of an analyst is to analyze, not justify or conceal bad news.

Now, I realize some folks are thinking, “Yeah, that’s great, Jim, I’ll just get myself fired by being an analytical hero”.

I’m not saying you should respond to data torture pressure by falling on your analytical sword.  What I am saying is you – and management – need to know this kind of pressure from a superior is shameful, not a “normal” part of being an analyst.  And as soon as you can, you should get a job somewhere people respect your professional opinions.  They don’t have to agree, but they must respect them.

Like the company you work for?  Ask a buddy in Finance if they could use a web analyst.  Pretty sure Finance would be interested in fully-loaded cost to acquire new customers by source!
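
For illustration only, here is a minimal sketch of what a “fully-loaded cost to acquire new customers by source” calculation might look like; the cost buckets, field names, and numbers are my assumptions, not figures from any real company.

    # Hypothetical sketch: fully-loaded cost to acquire new customers, by source.
    # Cost categories and example numbers are made up for illustration.
    from collections import defaultdict

    def fully_loaded_cac_by_source(cost_rows, new_customers_by_source):
        # cost_rows: (source, cost_type, amount) covering media, agency fees, tools, etc.
        total_cost = defaultdict(float)
        for source, _cost_type, amount in cost_rows:
            total_cost[source] += amount
        return {source: total_cost[source] / count
                for source, count in new_customers_by_source.items() if count}

    costs = [("paid_search", "media", 50000.0), ("paid_search", "agency_fee", 5000.0),
             ("display", "media", 30000.0), ("display", "ad_serving", 2000.0)]
    print(fully_loaded_cac_by_source(costs, {"paid_search": 1100, "display": 400}))
    # {'paid_search': 50.0, 'display': 80.0}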

What really troubles me about this situation is that it’s rarely ever talked about, so it could be worse than people might think.  At the very least, Senior Management should know about the potential for this to happen and lay down some rules.  Perhaps even seek some cultural guidance on this topic (here’s a start – Fear of Analytics).

So, I want to put this message out there, perhaps create a resource for people who are looking for information on this topic.  It would be great to have examples so managers can understand and be on the lookout for these situations.  Plus, I’m sure there are some terrific stories out there about either giving in to the torture pressure or resisting it!

What about you?  Were you ever pressured to torture the data?  What happened?  Did you comply?  How did things come out?  Tell us with a Comment.  Feel free to post anonymously and leave out company names.

17 thoughts on “Tortured Data – and Analysts”

  1. Jim,

    Great article, and it really hits home given several stories I have heard in the part of the industry I now work in (TagMan) and from my previous life on the agency side.

    The most pertinent one I would love to share is that of the analyst who KNEW that deploying your analytics tags at the top of a page (we all know this now) shows a beneficial X% difference in reported traffic/results compared to putting them at the bottom of the page (we have our own latency-based study on TagMan.com).

    He presented these details to the management team of his publicly listed company, who promptly told him to leave it be, as the gap between reported traffic and actual results made their public-facing business reports look all the better: an inflated value rather than the actual one.

  2. Hi Chris – that’s a great story, thanks!

    It’s a bit different than I was thinking of – in your case, at least management was presented with facts and *they* chose to ignore them. I’m actually OK with that – it’s management’s show, and the analyst retains integrity.

    Much more concerned about Sr Management *never hearing* the real story because of pressure to torture / conceal data from folks below them.

  3. Hi, Jim – been a fan of your work for years – glad someone is finally willing to expose this problem.

    Briefly, I was engaged to bring accountability and optimization methodology to a very large organization’s branded internet marketing effort. In a nutshell, after facing many months of passive/aggressive behavior, I was politely told by the sponsoring, internal service agency NOT to advocate ROI as the final determinant of campaign effectiveness as it was considered “unpopular” to be that accountable.

    This was in direct contradiction to the brand manager’s public requests on a spend of $5mm annually.

    Silly me: by delivering what the ultimate business client needed, I made the internal service agency feel their control was challenged, and as a result my contract with them was terminated.

    I guess developing a clear ROI business case was not in the best interests of those who have historically survived by being unaccountable, but politically expedient.

  4. STM, love your “Name”, thanks for the comment. Others, feel free to comment as “STM”…

    Imagine, “unpopular to be that accountable”. Of course, this speaks to the larger accountability movement in Marketing, which gains a bit of steam every day, fortunately…

    I often find that “forcing a hypothesis” can help with these kinds of situations. If not ROI, then what? Why are we doing this? What is the benefit? There must be *some* benefit to a spend of $5 million, right? This approach often creates some mild tension that can result in a stake in the ground. “The benefit of the campaign is web site visits”.

    OK, if it’s “visits”, how many? 500,000? OK. Does a 1-page, 2-second visit count? No? How about only visits with at least 2 pages? And so forth.

    If ROI can’t be the game, and you can get people to put a stake in the ground for some benefit, then at least you have something to analyze against and try to improve. (A rough sketch of counting a “qualified visit” metric like that follows at the end of this comment.)

    Thanks for the story!
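
    As a rough illustration of that kind of stake in the ground (my sketch, with an assumed field name and made-up sample data, not anything from the exchange above), counting “qualified” visits of at least 2 pages against a 500,000-visit goal might look something like this:

        # Hypothetical sketch: count "qualified" visits (>= 2 pages) against a target.
        def qualified_visits(visits, min_pages=2, target=500_000):
            # visits: iterable of dicts with an assumed 'pages_viewed' field
            qualified = sum(1 for v in visits if v.get("pages_viewed", 0) >= min_pages)
            return qualified, 100.0 * qualified / target

        # Made-up data: two of the three visits qualify.
        sample = [{"pages_viewed": 1}, {"pages_viewed": 3}, {"pages_viewed": 5}]
        print(qualified_visits(sample))  # (2, 0.0004)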

  5. Great post, Jim.
    A few weeks ago, I solved that problem for myself: after four long (too long) years of hearing exactly what you wrote (“If we want your opinion…”) and concealing results from our clients, I handed in my resignation.
    Not that the move was sudden. I protested as much as I could over the years (and sold my silence too many times, not something that I’m proud of). The usual excuse was: “That’s what gives you your paycheck”. There were things that I explicitly didn’t do, such as tampering with control groups, but I did conceal results. Despite my (very loud) protests, nothing changed, even when clients left us for the competition. Eventually, I couldn’t handle it any longer.

  6. AFD, thanks for the comment, and I admire your honesty. “Selling Your Silence” is a fantastic term that’s more to the point than the “torture” riff, and perhaps I’ll use that going forward instead (if it’s OK with you!).

    I think the important point is you knew what was going on, and dealt with it in your own way; no shame there! Overt “paycheck” threats are a particularly horrible practice that cell phone recordings might eventually put an end to…

    People need to understand that being an analyst by definition brings a certain amount of conflict, and *especially so* if you’re good at it. Deal with it as you must for where you are in your career and responsibilities, just know that selling your silence is not “part of the game”.

    And, there are companies out there who respect the role of the analyst, though ad agencies seem to have a bit of a problem with it, according to the responses received so far. That’s not surprising, is it?

    Any agencies out there want to hire an honest analyst like AFD?

  7. Thanks for your concern, Jim.
    It seems that there are some positions for experienced analysts, not only in the WA area. I got some feedback and hope to find a new job very soon.

  8. Jim,

    Another great post. It is crazy to see all of the comments and realize that we are not alone in our struggles. This is always comforting to me. Thank you for providing a place that we can voice our concerns and know that somewhere, someone else is in the same boat.

  9. Jim,

    This is a great discussion you’ve started–sorry if I’m a little late on the draw. First, I’m a huge fan of your blog, your book, and your work, but I must admit to being surprised at, well, your surprise.

    Maybe you haven’t worked for very many large companies?

    This sort of thing is positively endemic, and not just in web analytics but in all aspects of Corporate America, especially management consulting and Wall Street. At my last job I worked for a very sharp former McKinsey consultant and Harvard MBA, who was new to web analytics but had many years of data analysis and presentation experience. He was surprised (and delighted) to find that I would actually look at the data first, develop hypotheses, test them, and then prepare presentations with the insights and recommendations.

    Why was he surprised? Because he’d never known anyone who had actually done this before. In his experience, the process involved an “understanding” with the client… He’d start with the decision that the client wanted to make, and then “torture” the data to find evidence to justify the decision so that the client could sell this decision internally to employees and externally to the media (and to Wall Street of course). Spin 101.

    Speaking of Wall Street, and Wall Street analysts, they (for the most part) rarely do what you would call “real” analysis; instead the analysis is the marketing.

    As much as it pains me to say it, in the real world, VERY few large organizations have a true analytical culture, and the higher you go up the org chart (especially if it’s a publicly traded company), IME the less likely it is that any real analysis gets done and the more everything becomes a matter of appearances rather than substance.

    So I think you’re really touching on a subject that is much bigger than web analytics. It’s about organizational behavior, power, control, and spin.

  10. NotMyRealName, I’m shocked that you’re shocked that I’m shocked.

    I have worked at some big companies, both as an employee (HSN is a $2 billion operation) and a consultant (Verizon is kinda big) and I have seen the analytical culture work and not work, sometimes in the same company!

    HSN was basically a data fist fight – EVERYBODY had analysis, so disagreements were often decided by methodology; it was truly a data-driven culture. This is what happens when a minute-by-minute sales and profits dashboard is on every computer at Director level and above, I guess…

    What really bothers me in terms of web analytics is this: people will sit around and say they are “not getting any value out of their web analytics investment”. And that could be true: weak analytical talent.

    Or it could be that they’re not getting the real story out of their WA investment because everybody is so busy torturing the data to cover their arses that no action is ever taken.

    And I just wonder, what % is the former, what % is the latter?

    I guess we can only hope that Darwin will take over and the companies who are truly “Competing on Analytics” will survive and the data torturers will not!

  11. Such a great conversation. Sorry I’m late to the thread.

    We see this happen in the paid search world as well. We often lose potential business to competitors because when we look at their existing program, we see many opportunities to save money but very few to generate more sales by spending more. That’s not the message they want to hear, so they hire someone who says they can press the gas pedal that’s already on the floor to some point below the floorboards.

    We also see middle managers hired into our clients’ firms based on the premise that they will double or triple the size of the search program we’re running and do so cost-effectively. This usually involves first asking us to do stupid things that cost them huge amounts of money at horrendous return on investment, second asking us to hide the stupidity with subterfuge, and then firing us since we weren’t able to execute what they wanted. Usually we find these folks end up on the street 6 months later, but in the meantime we’ve lost a client… frustrating.

  12. George, it’s great to get perspective on this from yet another different kind of source – one with a performance-based culture.

    I can’t help but wonder if we are going through a transitional period where the business culture of “gut feel” is being replaced by the business culture of “numbers”, and there are a significant number of “resisters” out there who, as you said, end up on the street after a time. If they indeed end up out on the street, perhaps this means the culture is actually changing from the top and there is a process underway.

    On the other hand, in the online space the volume of voices supporting some form of mystical gut feel versus actual measurement is so loud – and the resistance to looking at history as a guide so fierce – that I wonder if it can be overcome.

    Thanks for your story!

  13. Unaccountability howls in protest as advertising & conversion metrics get confused.

    A truth shared by baseball & marketing optimization: the sum of the parts is NOT greater than the whole. Recent confusion within the online metrics community, driven by narrowly defined budget-protection interests, fuels the attribution debate.

    Back to baseball: what makes Albert Pujols great is not the total number of at-bats he gets. It’s his industry-peer-acknowledged impact on his team’s winning percentage. He’s what they call a “game changer” due to his rare ability to contribute across multiple performance categories: batting, fielding, running, & leadership. An ability derived from a daily regimen devoted to understanding success & a willingness to methodically test/learn/adapt to the changing competition.

    With the continued revenue dominance of paid search vs display & even online vs offline effectiveness being greatly debated, every channel seemingly is reacting as if it’s a zero-sum game. As in baseball, if your team doesn’t have an Albert Pujols, then you’re already at a disadvantage & likely will experience a losing percentage.

    Multi-channel attribution modeling is a time-consuming & expensive technological undertaking, one that holds all the potential for generating substantial conflict within organizations that have not resolved internal channel revenue recognition.

    Given that a marketer’s most crucial resources are time & money, one can arguably question the value of pursuing the development of a truly accurate multi-channel attribution model (especially down to the transactional level) relative to the impact on “winning percentage” derived from maximizing other things within their control.

    For example, if the proponents of display advertising feel last-click attribution is inappropriate for branding campaigns, then why not focus on the solution instead of the problem? Focus on making the creative-to-placement-to-landing-page assignments as relevant as possible for your targeted segments. This will enable a higher percentage of the gross impressions served to actually reach their intended audience.

    Paid search advertising does more with less simply because it was designed from inception to deliver relevance (clicks) efficiently versus gross tonnage (impressions).
    In all fairness to display advertising, its delivery model, and thus its economic revenue model, was historically never designed with transactional efficiency in mind, but rather for massive market reach, similar to print and television.

    This was all acceptable until the game and the audience evolved.

    And that is why we have proponents of adapting digital measurement to include gross rating point (GRP) methodology, in an attempt to pull offline media spend over to online.

    With the arrival of paid search market reach, continued channel fragmentation including the emergence of social networking and mobile-empowered consumers, & a crippling economic recession forcing evaluation of spending effectiveness, the game has fundamentally shifted.

    As in baseball, when the steroid-fueled era of inflated home run totals was found not to correlate cost-effectively with winning percentage, the industry adjusted to other, more significant determinants of overall winning outcomes.

    So, instead of attempting to change the definition of winning percentage or success outcomes, consider that we as marketers, vendors, & measurement strategists already possess the tools to make a significant business impact.

    Tools & methodology which also do not require the industry to get sidetracked in unproductive, time-consuming debates over the merits of having or not having good multi-channel attribution modeling in place before one can improve their individual internet marketing results.

  14. Jim, check out a very compelling eMarketer report on a survey of CMOs, which revealed: a March 2010 survey by Chief Marketer showed the click remained on top, with 60% of US marketers reporting they measured performance in click-throughs. Fewer than two-fifths measured overall return on investment (ROI).

    “Marketers’ familiarity with clicks is only one factor that contributes to its continued usage as the top metric,” said David Hallerman, senior analyst at eMarketer. “Clicks are easy to count, too, and therefore an inexpensive metric to gather.”

    “In contrast,” Mr. Hallerman said, “measuring either brand effectiveness or the indirect effects of online ads—such as how display ads contribute to search clicks—is more complex and typically costs more to accomplish than just tallying up clicks.”

  15. DSTM, there are so many people in love with “easy”. And you have to hope it’s about more than being lazy; for example, that they’re genuinely “busy”, so “easy” is a good thing.

    Yet I have to wonder how many of these people are basically wasting a lot of their time chasing unproductive ideas and noise. If they were focused on the right ideas, perhaps they would find time to measure what matters, not just what is easy to measure.

    Marketing is one of the few silos in the enterprise that has not come under a lot of direct pressure to be productive. Certainly manufacturing had its turn, and the end output of CRM seems to have raised productivity in service.

    If you talk to recruiters, they are starting to see demand for “accountable” marketers in all kinds of orgs, and especially in senior positions. Perhaps the CFO is indeed knocking on marketing’s door.

    If so, perhaps it’s time to look for more than “easy” as the solution.

  16. Great post. We see people manipulating data all the time to match their agenda. Sometimes it is malicious, to push their message out there regardless of what the data says, and sometimes it is just inexperience and a pre-conceived notion of what the data should say that drives the final result.

    http://www.brightmetrics.com/blog
