The news that Omniture has acquired Touch Clarity is shaking up the world of web analytics a bit. Machine automation has always been a very sexy sell for software companies. The problem is that people think it’s a magic bullet and often end up using these tools to their disadvantage, because they lack the experience to understand how to use them properly. Then they get caught in the trap of Reporting versus Analysis.
Here is a real-world example from the Lab Store. I am constantly fighting the Google AdWords A/B split-testing algorithm for rotating ads. Google almost always picks the wrong ad to run more frequently, so I have to force it to rotate the ads 50/50 in order to get accurate results. How do I know Google is picking the wrong ad? Because I have seen thousands of such tests, online and off, and I have a “feel” for these things based on my background in Database Marketing, Consumer Behavior, and Psychology. In every case where Google picked one ad over another and I then forced a 50/50 rotation, it turned out I was right: Google had flagged the ad that generated the least profit per dollar of PPC spend as “best” and demoted the more profitable ad until it was not running at all.
Why does this happen? Because Google isn’t smart enough to understand the complexity of the customer behavior in the Lab Store – and it can’t be, given the number of clients it has. If you have done a lot of this kind of testing, you know that often the campaign with the highest response rate generates the lowest quality customers. While these campaigns were running, I could see that the visitors generated by the campaigns Google picked as “best” were actually inferior to the visitors generated by the campaigns Google demoted, using a variety of metrics other than conversion (primarily Recency). In other words, I was able to predict Google was doing the wrong thing by looking at the Customer LifeCycle. When I forced Google to run the ads 50 / 50 to give the demoted ads a chance, I was proven right – the campaigns Google demoted had a 90-day ROMI averaging 2.1 times higher than the campaigns Google promoted.
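To make the arithmetic behind this concrete, here is a minimal sketch in Python. The numbers are entirely hypothetical, not the Lab Store’s actual data; they are simply chosen so the demoted ad ends up with a 90-day ROMI about 2.1 times higher than the promoted ad, mirroring the outcome described above. The point it illustrates is that the ad with the higher conversion rate can still be the worse ad once downstream customer value is counted.

```python
# Hypothetical figures illustrating conversion rate vs. 90-day ROMI.
# None of these numbers come from the article; they are made up for the sketch.

def romi(revenue_90d, spend):
    """90-day return on marketing investment: revenue per dollar of PPC spend."""
    return revenue_90d / spend

# Ad A: the ad the rotation algorithm promoted (higher conversion rate).
ad_a = {"clicks": 1000, "orders": 50, "spend": 500.0, "revenue_90d": 900.0}

# Ad B: the demoted ad (lower conversion rate, but better customers
# over the following 90 days).
ad_b = {"clicks": 1000, "orders": 35, "spend": 500.0, "revenue_90d": 1890.0}

for name, ad in (("A (promoted)", ad_a), ("B (demoted)", ad_b)):
    conv = ad["orders"] / ad["clicks"]
    r = romi(ad["revenue_90d"], ad["spend"])
    print(f"Ad {name}: conversion {conv:.1%}, 90-day ROMI {r:.2f}")
```

An algorithm that only sees clicks and conversions would promote Ad A (5.0% vs. 3.5% conversion), yet on these assumed numbers Ad B returns 3.78 per dollar of spend against Ad A’s 1.80, roughly 2.1 times more.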
Look, I know these are software companies and their sole purpose in life is to create the next big thing and sell their software into it. That’s fine, and frankly, I hope they are successful in doing it, because it will create a tremendous amount of business down the road for database marketing consultants as “machine optimization” hits the wall and companies need to be rescued from the results of it. Just like they had to be rescued from demographic clustering in the ’80s and data mining in the ’90s.
People are always looking for the easy way out, and it ends up costing them more in the long run because they don’t really understand what the tool does and does not do. Perhaps that is simply the state of Marketing today. So be it…
If you are an analyst and you see a black-box test result that simply does not make sense based on your past experience, I encourage you to question the result and find a way to test it outside the system. Learn why it happened, because this kind of incident usually leads to the shattering of some myth or bias, and that is something you will be most happy to fully understand.