Category Archives: Analytical Culture

eMetrics “ShootOuts” We’d Like to See

I was in Vancouver for a presentation to CAUCE [kay-yoose, thanks Raquel] and was able to grab a quick dinner with fellow WAA BaseCamp stakeholders Andrea Hadley, Raquel Collins, and Braden Hoeppner. We're rolling out a new 2-day format for BaseCamp and got to talking about web analytics education in general.

We started talking about audience segmentation and content at the eMetrics Summit, and specifically the "shootout" format from the old days.  You know, 10 vendors on the stage at the same time taking questions from the audience.  Those sessions were educational and hilarious at the same time, as the vendors side-swiped each other on topics like accuracy, how visitors are counted, cookie structures, and so forth.

But that was back when the technology was in flux, and that issue has since settled down a lot.  Braden brought up the idea of bringing back the "shootout" format, but focused on the business side.  You know, get some practitioners, vendors, and consultants up on stage and have them thrash out stuff like:

1.  Attribution – does it really make sense to even bother with attribution at the impression / click level when there is often not a strong correlation to profit?  I mean, just because someone sees or clicks on an ad does not mean the ad had a positive effect; in fact, it may have had a negative effect.  Why not go straight to action or profit attribution, instead of using creative accounting?

Continue reading eMetrics “ShootOuts” We’d Like to See

Got Discount Proneness?

Discount Proneness is what happens when you “teach” customers to expect discounts.  Over time, they won’t buy unless you send them a discount.  They wait for it, expect it.  Unraveling this behavior is a very painful process you do not want to experience.

The latest shiny object where Discount Proneness comes into play is the "shopping cart recapture" program.  Mark my words, if it is not happening already, these programs are teaching customers to "Add to Cart" and then abandon it, waiting for an e-mail with a discount to "recapture" this sale – a sale that, for many receiving the e-mail, would have taken place anyway.

The best way to measure this effect is to use a Control Group.

When I hear people talking about programs like this (for example, in the Yahoo analytics group) what I hear is “the faster you send the e-mail, the higher the response rate you get”.

That, my friends, is pretty much a guarantee that a majority of the people receiving that e-mail would have bought anyway.  Hold out a random sample of the population and prove it to yourself.  There is a best, most profitable time to send such an e-mail, and that time will be revealed to you using a controlled test.  The correct timing is almost certainly not within 24 or even 48 hours.
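The holdout logic above can be sketched in a few lines. This is a minimal illustration, not the author's actual methodology, and all the counts are invented for the example:

```python
# Hypothetical holdout (control group) test for a cart-recapture e-mail.
# The abandoner counts and buyer counts below are made up for illustration.

def incremental_rate(test_buyers, test_size, control_buyers, control_size):
    """Conversion-rate lift of the mailed group over the randomly held-out group."""
    test_rate = test_buyers / test_size        # rate among those who got the e-mail
    control_rate = control_buyers / control_size  # rate among those who got nothing
    return test_rate - control_rate

# Mailed group: 5,000 cart abandoners get the discount e-mail, 600 buy (12%).
# Holdout:      5,000 cart abandoners get nothing, 450 buy anyway (9%).
lift = incremental_rate(600, 5000, 450, 5000)
print(f"Incremental conversion rate: {lift:.1%}")  # prints "Incremental conversion rate: 3.0%"
```

In this made-up case, the 12% "response rate" would look great, but only 3 points of it are truly incremental; the other 9 points would have bought without the e-mail (and without the discount).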

That is, if you care about Profits over Sales, and trust me, somebody at your company does.  They just have not told you yet!

When you give away margin you did not have to give away to make a sale, that is a cost.  Unless you are including that cost in your campaign analysis, you are not reflecting the true financial nature of the campaigns you are doing.  If you are an analyst, that's a problem.
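Here is a back-of-the-envelope sketch of what including that cost looks like. All figures (order counts, average order value, margin, discount) are invented for illustration:

```python
# Hypothetical sketch: subtracting the margin given away as discounts
# from campaign profit. All figures are invented for illustration.

def campaign_profit(orders, avg_order_value, margin_rate,
                    discount_rate, discounted_orders):
    """Gross margin on campaign orders minus the margin given away as discounts."""
    gross_margin = orders * avg_order_value * margin_rate
    discount_cost = discounted_orders * avg_order_value * discount_rate
    return gross_margin - discount_cost

# 600 orders at a $50 average order value and 40% product margin;
# every one of the 600 redeemed a 15% coupon.
naive_view = 600 * 50 * 0.40                           # $12,000 of margin, looks great
true_profit = campaign_profit(600, 50, 0.40, 0.15, 600)
print(naive_view, true_profit)  # prints "12000.0 7500.0"
```

Layer this on top of the control-group result and the picture gets worse: the discount cost applies to every redeemer, including the ones who would have bought at full price anyway.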

If you are using cart recapture campaigns, please do a controlled test sooner rather than later.  Because once your customers have Discount Proneness, it will be very painful to fix.

Continue reading Got Discount Proneness?

Marketing Science (Journal)

As I said in the Heavy Lifting post, I think the Web Analytics community is becoming increasingly insular and should be paying more attention to what is going on outside the echo chamber in Marketing Measurement.  I also think the next major leaps forward in #wa are likely to come from examining best practices in other areas of Marketing Measurement and figuring out how they apply to the web.

For example, did you even know there is a peer-reviewed journal called Marketing Science, which calls itself “the premier journal focusing on empirical and theoretical quantitative research in marketing”?

Whoa, say what?

This journal is published by the Institute for Operations Research and the Management Sciences, and articles are the work of premier researchers in visitor and customer behavior from the best known institutions around the world.  In case you didn't know, "peer-reviewed" means a bunch of these researchers (not including the authors, of course) have to agree that what you say in your article is logical based on the data, and that any testing you carried out adhered to the most stringent protocols – sampling, stats, test construction, all of it.

And, most mind-blowing of all, they show you the actual math right in the article – the data, variables, formulas, graphs – that leads to the conclusions they formulate in the studies.  You know, like this:

Continue reading Marketing Science (Journal)