Category Archives: DataBase Marketing

Measuring dis-Engagement

Engagement Matters – Until it Ends.  Right?

Here’s something that continues to puzzle me about all the efforts around measuring Engagement and using these results as a business metric or model of online behavior.

If Engagement is so important to evaluate – and it can be, depending on how you define it – then doesn’t the termination of Engagement also have to be important?  If you desire to create Engagement, shouldn’t you also care about why / how it fails or ends? And if the end is important, what about how long Engagement lasts as a “quality” metric?

Seems logical the end of Engagement might matter.  Let’s call it dis-Engagement.  Simple concept really: of the visitors / customers that are Engaged today (however you define Engagement), what percent of them are still Engaged a week later?  3 months or 1 year later?

Whatever dis-Engagement metric you decide to use, a standard measurement would create a level playing field for evaluating the quality of Engagement you create.  From there, a business could invest in the approaches producing the most durable outcome.
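To make the idea concrete, here's a minimal sketch of the calculation. Everything here is a hypothetical assumption, not a standard: the 30-day engagement window, the visitor names, and the toy interaction log are all invented for illustration.

```python
from datetime import date, timedelta

# Hypothetical definition: "Engaged" = at least one interaction in the
# prior 30 days.  The window and data below are illustrative only.
WINDOW = timedelta(days=30)

def engaged_at(interaction: date, as_of: date) -> bool:
    """True if this interaction falls inside the window ending at as_of."""
    gap = as_of - interaction
    return timedelta(0) <= gap <= WINDOW

def dis_engagement_rate(log: dict, start: date, later: date) -> float:
    """Of visitors Engaged at `start`, the fraction no longer Engaged at `later`."""
    cohort = [v for v, dates in log.items()
              if any(engaged_at(d, start) for d in dates)]
    if not cohort:
        return 0.0
    lost = [v for v in cohort
            if not any(engaged_at(d, later) for d in log[v])]
    return len(lost) / len(cohort)

# Toy interaction log: visitor -> interaction dates
log = {
    "a": [date(2024, 1, 5), date(2024, 2, 20)],  # keeps interacting
    "b": [date(2024, 1, 10)],                    # goes quiet
}
rate = dis_engagement_rate(log, start=date(2024, 1, 15), later=date(2024, 3, 1))
# rate = 0.5: both visitors were Engaged on Jan 15; only "a" still is on Mar 1
```

The same function answers the "week later, 3 months later, 1 year later" question by simply moving the `later` date.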

Since Engagement is almost always defined as an interaction of some kind, tracking dis-Engagement could be standardized using metrics rooted in human behavior.  Recency is one of the best metrics for an idea like this because it’s universal, easy to understand, and can be mapped across sources like products and campaigns.  Recency is also predictive; it provides comparative likelihoods, e.g. this segment is likely more engaged than that one.

Plus, using Recency would align online customer measurement with offline tools and practices.  This could have implications for ideas like defining “current channel”, e.g. a customer is now engaged with this channel and has dis-engaged from that channel.
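A sketch of Recency used this comparative way, across two hypothetical segments (segment names and dates are made up; the comparison logic is the point):

```python
from datetime import date
from statistics import median

def recency_days(interaction: date, as_of: date) -> int:
    """Days since an interaction; lower = more recent."""
    return (as_of - interaction).days

today = date(2024, 6, 1)

# Last-interaction dates for visitors in two hypothetical segments / channels
segments = {
    "newsletter":  [date(2024, 5, 28), date(2024, 5, 20), date(2024, 4, 1)],
    "paid_search": [date(2024, 3, 1),  date(2024, 2, 15), date(2024, 5, 1)],
}

# Median recency per segment gives a comparative read: the segment with
# the lower number is likely the more Engaged one.
scores = {name: median(recency_days(d, today) for d in dates)
          for name, dates in segments.items()}
# scores -> {"newsletter": 12, "paid_search": 92}
```

Because Recency is just "days since last interaction", the same score works whether the source is a product, a campaign, or a channel, which is what makes the cross-source mapping possible.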

Taking this path brings up a couple of other related ideas, in line with the discussion around customer journey and entwined with the whole customer experience movement.

Peak Engagement

Let’s say there is Engagement, and because we’re now measuring dis-Engagement, we see Engagement end.  So, is Engagement a one-shot state of being, meaning the value should be measured as such?  Or does longer-lasting Engagement have value, and if so, what about when it ends?  Shouldn’t we want to find the cause of dis-Engagement?

Continue reading Measuring dis-Engagement

Marketing Responsible for Customer Experience?

The Data

According to this survey, Marketers are not currently “responsible” for the customer experience (whatever responsible means in this context) but will be over the next 3 years.  If it were just the vendor (Marketo) trumpeting this idea, I’d be more skeptical.  But this vendor hired the Intelligence Unit from The Economist organization to do the work, and the report includes the actual questions, meaning you can check for bias.  The population is 478 CMOs and senior marketing executives worldwide – seems decent / not cherry-picked.

So I will cut the vendor some slack.  Questions though, right?  Just what is customer experience, in particular for the purposes of success measurement?  How does it fit with related ideas like Customer Journey / LifeCycle and Engagement?  Certainly if the above is a significant macro trend, we ought to sort all this out first?  And of course, put some analytical rigor (structure, process, and definitions?) in place to support the effort ;)

The Story

I know a lot of marketing people who have either had this authority for years (multi-channel database marketing) or are moving in this direction, so the results make sense to me.  To be clear(er), “experience” for these people reaches all the way back from UX into fulfillment and service.  So when they talk about experience, they are talking visitor and customer; not just navigation and landing pages, but also shipping times and return rates.

Perhaps increased access to customer data is revealing the significant impact customer experience in this larger sense has on long-term customer value?  This idea, coupled with increased focus on accountability (also covered in the survey) could be driving this trend.

Worth the read – only 20 pages long with a lot of charts.  Here are 4 snippets to hook you:

Continue reading Marketing Responsible for Customer Experience?

Do NPS / CES Feedback Metrics Predict Retention? Depends…

Survey Says?

Several questions came in about the ability of surveys to predict actual behavior, covered in the post Measuring the $$ Value of Customer Experience (see 2. Data with Surveys).  My advice is this: if you are interested in taking action on survey results, survey specific visitors / people with known behavior where possible, then track those subjects over time to see whether there is a linkage between survey response and actual behavior.  You should do this at least the first time out for any new type of survey you launch.

Why?  Many times, you will find segments don’t behave as they say they will.  In fact, I have seen quite a few cases where people do the opposite of what the survey implied.  This happens particularly often with best customers – the specific people you most want to please with modifications to product or process.  So this is important stuff.

You’ve Got Data!

Turns out there’s a new academic (meaning no ax to grind) research study out addressing this area, and it’s especially interesting because the topic of study is the ability of customer feedback metrics to predict customer retention.  You know: Net Promoter Score, Customer Effort Score and so forth, as well as standard customer satisfaction measures like top-2-box.

The authors find that the ability of any one of these metrics to predict customer retention varies dramatically by industry.  In other words, you might want to verify the approach / metric you are using by tying survey response to actual retention behavior over time.
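As a sketch of what that verification could look like: bucket respondents by their NPS score (promoters 9–10, passives 7–8, detractors 0–6 – those cut-offs are the standard NPS definition), then compare actual retention per bucket some months later. The customer IDs, scores, and retention flags below are invented for illustration.

```python
def nps_bucket(score: int) -> str:
    """Standard NPS buckets: 9-10 promoter, 7-8 passive, 0-6 detractor."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

# Hypothetical data: (customer_id, survey score 0-10, retained 6 months later?)
responses = [
    ("c1", 10, True), ("c2", 9, True),  ("c3", 6, False),
    ("c4", 3, True),  ("c5", 8, False), ("c6", 10, False),
]

counts = {}  # bucket -> (retained, total)
for _, score, retained in responses:
    bucket = nps_bucket(score)
    kept, total = counts.get(bucket, (0, 0))
    counts[bucket] = (kept + int(retained), total + 1)

retention_rate = {b: kept / total for b, (kept, total) in counts.items()}
# If promoters don't retain meaningfully better than detractors here,
# the metric isn't predicting retention for this business.
```

The same join works for Customer Effort Score or top-2-box satisfaction: swap the bucketing function, keep the behavioral follow-up.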

Continue reading Do NPS / CES Feedback Metrics Predict Retention? Depends…