“About the Blog” as a Post

I had a request to publish my “About the Blog” page as a post so people could comment on it.  Here ya go, Jacques.

From the Drilling Down newsletter, 12/2004:

What is the number one characteristic shared by companies who are successful in turning customer data into profits?  The company fosters and supports an analytical culture.

Web analytics and Pay-per-Click Marketing in particular have served to teach many people the basics of applying the scientific method to customer data and marketing – creating actionable reporting, tracking source to outcome, KPI’s, iterative testing, etc.  The web has allowed companies to dip a toe into the acting-on-marketing-data waters at relatively low cost and risk when compared with offline projects.  And many have seen incredible ROI.

I think web analytics could be poised in the future to serve a greater role – teaching people / companies the optimal culture for success using analytics, also at relatively low cost and risk.  It’s going to be much harder to drive this concept but more rewarding if as users we can make this happen, because today’s web analysts (and maybe analytical apps) could potentially be among tomorrow’s leaders in a data-based, analytics-driven business world.

For example, do you think analyzing / understanding new interactive data streams where the interface is not a browser will be any different, in terms of the culture required to turn interactive customer data into profitable business actions?  I don’t.

Look, a “request” is a request, whether a click, IP phone call connect, cable TV remote button push, verbal command, card swipe, RFID scan, etc.  You’re still asking a computer to do something.  The request has a source, is part of a sequence (path), and has an outcome. 

Analysis of these requests will face challenges and provide potential benefits similar to those provided right now in web site analytics.  This is the beginning of analyzing the interaction of computers, people, and process.  

Without a doubt, no matter what form these requests take, there will be a “log” of some kind to be analyzed.  Usability?  Conversion?  ROI?  These issues are not going to go away, and companies need to develop a culture that properly embraces analyzing and addressing them.  Companies not developing this culture will find themselves continuing to bump along the “drowning in data” road and will never optimize their interactive customer marketing.

As I see it, here’s the “culture” issue in a nutshell: as a company, you have to want to dig into data and really understand your business.  This pre-supposes that you (as a company) believe that understanding the guts of your business through analytics will drive actions that increase profits.  If the company doesn’t generally support this idea, there is no incentive for anyone to pursue it and the company just happily bumps down the road.

Of course most people don’t really relate to the “company”, but to their own division or functional silo.  So you might have manufacturing / engineering groups who live and die through analytics while marketing is not held to the same standards and thought processes.  This is where the idea of Six Sigma Marketing comes in; it’s a “bridge” of sorts that tries to say (perhaps to the CEO and CFO), “Hey folks, if the engineers can engage in continuous improvement through ongoing analytics, so can the Customer Service silo and the Marketing silo and perhaps others.”

At a higher conceptual level, analytical culture takes root when management makes it known they are not afraid of failure, and want employees not to be afraid of it either.  

Another way to say this is experimentation and testing are encouraged throughout the company.  Failure is a regular occurrence, and is even celebrated because through failure, learning takes place.  Show me a company with no failures or that hides failure and I’ll show you a company that is asleep at the switch, afraid of its shadow, a company soon to be irrelevant to the market it serves.

Hand in hand with accepting failure must be continuous improvement.  Even though failure is embraced as a learning tool, the lesson of the failure both prevents it from happening again and results in new ideas with a higher potential for success.  These twin ideas of embracing failure / continuous improvement are at the heart of every business successful in using analytics to improve profitability.

“Evidence” of a company with the right bones to grow an analytical culture is this: you see the various levels of employees working in cross-functional teams with a common problem-solving mission.  Instead of people in a silo groaning about members from other silos being present at a problem-solving meeting, people are instead asking, “Where is finance, where is customer service?”

The most common place “analytics” live in a company is in Finance with the “Financial Analysts”, who are mostly tasked with analysis related to financial controls and producing financial reports.  If marketing or customer service were willing to expose themselves to the rigor of these analysts, they would undoubtedly be able to improve their business areas.  But that exposure takes substantial guts and confidence in your abilities, and a “culture” that supports a scientific process.

And you can’t engage in this process without analytics; success and failure need to be defined and measured.  The easiest way to encourage this culture to take root is to team a department head with a Financial Analyst familiar with the area.  

Often, you find this finance person already has insightful questions that could lead to improvement, but “never asked” because “it’s not my job”.  And often, to make changes in a business today, you need IT support of some kind.  That’s the basic cross-functional unit – Finance rep, IT rep, and a department head.  

I would also argue that if Marketing has a seat at the table in the strategic, “Voice of the Customer” sense (as opposed to being relegated to Advertising, PR, and Creative), then marketing is part of the core unit.  Then you add other disciplines as needed based on the particular problem you are trying to solve.

If the culture is flexible enough, this can turn into “Business SWAT”, where the best and brightest cross-functional teams roam through the company as “consultants”, tackling the hardest business problems, which (surprise) are usually cross-functional in nature.  And “blame” is never on the agenda; it’s about “how can we help you make it better?”  You need a culture that is clear about this idea in order for people to expose themselves to the analytics-based scientific process.  Success and failure are defined by the analytics.

If you think about it, web site management ruled by analytics is a microcosm of this Business SWAT set-up.  You have marketing, finance (ROI component), and technology all working together based on the data.  That’s why I think there is a higher mission for the web analytics area / people; they are building the prototype that can teach companies how to go about measuring, managing, and maximizing a data-driven business.

At the highest level of this culture, managers “demand” these SWAT teams because the success rate and business impact are so high.  As the various departments or functional silos produce wins and losses, capital (budget) flows to where the successes are and away from the failures.  When managers see this happening, they jump on board, because they want the budget flowing their way.  This creates a natural economic supply and demand scheme with a reward system for participation built into the process.

One caution: when the culture gets to this level, the analytics group must be sanitized from the reporting hierarchy.  It can’t report to finance, or marketing, or IT anymore.  It has to be completely independent, which usually means reporting directly to the CEO.  There has to be confidence in the integrity of the results of all testing based on standards.  All the little “pools” of analytical work throughout the company must be gathered into one.

What kind of companies do you see really engaging in this kind of culture right now? Those that for legacy reasons have always had access to their operational and customer data and have been using analytics for years.  For these legacy players, web analytics is a “duh” effort – they get it right out of the box, because it’s more of the same to them.  But many types of businesses have not had this access to data before and web analytics is the first taste they are getting of the power and leverage in the scientific method.  I think this “accountability” disease we’ve created in web analytics and search marketing will continue to spread and infect every business unit.

The longer-term question is, can we flip this model over, can the successful culture of cross-functional approach and continuous improvement used in web analytics be used to create a “duh” moment for other areas of the company?  Will “best practices” and success stories create an environment where people say to the (web?) analytics team, “Hey, can I get some of that over here?”  In other words, will the analytical culture develop?

Methinks there is more going on with web analytics than meets the eye; it’s potentially a platform for the creation of a new business culture, a culture based on the scientific method – Six Sigma Everything.  Sure, it’s awkward and maybe the web is not meaningful enough yet to many companies.  But as we thrash all this out, there is something greater being learned here.

Right now, many CRM projects can’t show ROI because nobody knows what to do with the data, how to turn it into action that improves the business.  Sounds very much like web analytics 5 years ago…and look what we talk about now.  KPI’s, turning data into action.  The analytical culture playing out.

What does this mean for the people currently involved in web analytics?  If I were a young web analytics jockey, I would be preparing for the spread of the analytical culture, and seriously thinking about learning some of the tools traditionally used in offline analytics – the query stuff like Crystal Reports, the higher end stuff like SAS, SPSS, and so on.  Search the web for “CHAID” and “CART” and see if you like what you read about these analytical models.  If this kind of stuff interests you, you are much closer to being a business analyst than you think.  And guess what?  Analysts who can both develop the business case and create the metrics and methods for analysis – like you have to do for a web site – are rare.
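If you want a feel for what CART-style models actually do before diving into the tools, here is a minimal sketch of the core idea: find the split of a customer list that most reduces the Gini impurity of an outcome.  The customer records and field names below are invented for illustration; real work would use SAS, SPSS, or an equivalent package rather than hand-rolled code.

```python
# Minimal illustration of the CART splitting criterion: try every
# threshold on a numeric feature and keep the one that yields the
# lowest weighted Gini impurity of the two resulting groups.

def gini(rows):
    """Gini impurity of the binary 'converted' outcome in a group."""
    if not rows:
        return 0.0
    p = sum(r["converted"] for r in rows) / len(rows)
    return 2 * p * (1 - p)

def best_split(rows, feature):
    """Return (threshold, weighted_impurity) for the best split
    of `rows` on a numeric `feature`."""
    best = (None, gini(rows))
    for t in sorted({r[feature] for r in rows}):
        left = [r for r in rows if r[feature] <= t]
        right = [r for r in rows if r[feature] > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
        if w < best[1]:
            best = (t, w)
    return best

# Hypothetical data: do repeat visitors convert?
customers = [
    {"visits": 1, "converted": 0},
    {"visits": 2, "converted": 0},
    {"visits": 8, "converted": 1},
    {"visits": 9, "converted": 1},
]
print(best_split(customers, "visits"))  # splits cleanly at visits <= 2
```

A real CART model simply applies this search recursively, splitting each resulting group again until the groups are pure or too small; CHAID works the same way but uses a chi-square test to pick splits.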

It takes a particular mind set, and that mind set is not common.  Most of the people with the right mind set go into the hard sciences, but demand on the soft side of business (marketing, customer service, etc.) is just beginning in our data-driven world.  

On the hard side, (with all apologies to the real engineers out there for the exaggeration) the drug works or it doesn’t, the part fits or it doesn’t.  The development of softer-side marketing and service analytical techniques is always going to be populated with a lot more gray area than there is on the hard side, and it takes a special skill to conceive of and develop the metrics required.  But we should be trying to bring the same analytical rigor to the soft side of business that the hard side has always had to deal with.  The trick is to apply that rigor without damaging the mission.

For example, the whole “fire your unprofitable customers” thing from some factions in CRM.  That’s ridiculous.  What you want to do is identify them and then act appropriately, whether that means controlling their behavior, not spending additional resources on them, or not doing the things that create them in the first place.  That’s the gray showing.  You don’t just hit the “reject button” on a customer.
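The “identify them and then act appropriately” idea above can be sketched as a simple triage rule.  The revenue and cost figures and the action labels here are entirely hypothetical; the point is that the output is a graded action, not a reject button.

```python
# A hedged sketch of "identify, then act appropriately" for
# unprofitable customers. All numbers and labels are invented.

def action_for(customer):
    """Map a customer's profit picture to a gray-area action
    instead of a binary keep/fire decision."""
    profit = customer["revenue"] - customer["cost_to_serve"]
    if profit >= 0:
        return "invest"            # profitable: keep marketing to them
    if customer["cost_to_serve"] > 2 * customer["revenue"]:
        return "control behavior"  # e.g. steer toward self-service channels
    return "stop spending"         # no added resources, but no rejection

customers = [
    {"id": 1, "revenue": 100, "cost_to_serve": 40},
    {"id": 2, "revenue": 20,  "cost_to_serve": 90},
    {"id": 3, "revenue": 50,  "cost_to_serve": 60},
]
print({c["id"]: action_for(c) for c in customers})
```

The thresholds would of course come from your own analytics, not from a rule of thumb; the sketch only shows the shape of the policy.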

Customer data is customer data.  It’s all going to end up in one place eventually as the analytical culture spreads, and those with the skills to apply the scientific method across every customer data set are going to be rare and in very high demand.  Don’t spend all your spare time watching the Forensic Files on Court TV.  You’re a business analyst.  Get out there and learn the rest of your craft!

And, please consider doing whatever you can, whenever you can, to spread the analytical culture within your company.  If most of what your analytics involve is “online marketing”, reach out to “offline service” or another silo and ask if you can help them with anything.  What’s the call they would like to take less of, can you use the web site to make that happen – and prove that it worked?  Can you use the web site to generate offline ROI?  

Web analysts, you are the cross-functional prototype.  Please teach others how to optimize the entire business.


7 thoughts on ““About the Blog” as a Post”

  1. Hi Jim,

    First, I think this About of yours is really well positioned to become the best first shot at what should be our framework as a field (Web Analytics). Too often the discussions focus on applications, probably a side effect of more and more money being at stake. It is pretty clear now that operationalizing Web Analytics is crucial, and I salute efforts such as Eric Peterson’s (WA Business Process) in this regard. Personally, I have been hammering to my clients lately that at least 50% of their chances of success reside on the operations side.

    Obviously, you are a proponent of the “analytics-driven business world”, as any avid reader of “Drilling Down” (book/newsletter) knows. In my case, I have lately come to realize that the coming year, or year and a half, will witness the impact of the database marketing framework on web management and Web Analytics.

    This being said, I find it quite original that you are saying WA is going to be the virus by which many companies will be infected with the analytics culture. The arrival of Google Analytics is a blessing! It’s the early web, pre-Norton! Everybody is gonna get it. But will they “get it”?

    This is one place where I have found a lot of resistance. Your call for the right to fail is at the same time very true, and a taboo. It is still very hard for many managers I have met to simply “smell the coffee”, to admit that this or that just plain doesn’t work; face it, get rid of it, and try to find something better. In their heart of hearts, they know they should be measuring, that if you can’t measure you can’t manage, etc., but they still believe in the “build it and they will come” myth.

    It is quite understandable. Who can go to his or her boss with a large smile and say “Boy! Did we miss the target on this one!”? Most businesses value only success… all the time. Ever seen a responsibility-ducking meeting? Companies should learn to encourage some degree of failure, but of “controlled” failure. “Measured”, as you say. I mean, isn’t trial/error at the heart of invention/innovation? The “I can tell you 10,000 ways not to make a lightbulb” kind of attitude? The very engine of value creation?

    How then can this new analytics culture be instilled? How can companies who haven’t yet been playing in the data sandbox realize such a change? How can we accept some degree of monitored failure, become a learning organization, and get better one day at a time?

    Your Business SWAT seems to be one excellent way to start executing that vision. I find it quite interesting when you say that, at one point, such a “unit” would have to get “sanitized” and not report to anyone but the CEO. Hmmm, there’s a group of people in charge of taking all the data and optimizing the business, which is to say maximizing its profits. Well, if this is what that team would do and do well, my question is: how long will it stay ad hoc? How long will it take before people realize that’s where the nice stuff is happening, where the action is? Aren’t you hinting at some profound changes here?

    I can’t imagine such a team, with its impact on the business culture, staying for long in the “other-department-service” model before becoming it: THE business itself. I guess this is what you mean by “analytics-driven business”.

    And Web Analytics is going to be the most important change factor? Man! Did I do well to jump ship 5 years ago!

    But, Jim, one last question: if you say that a company needs to have faith in analytics before changing its culture, where will that faith come from? Really, Web Analytics? Acceptance of a true trial/error system? Definitely not Wall Street I suppose.

  2. Well said, Jacques. Many questions, not so many answers. This is the quest I am on.

    As a background, let me tell you, I have seen these kinds of companies in action. Offline, they tend to be traditional direct marketing or database marketing companies – catalogs, some financial services outfits (Capital One comes to mind), some publishers. Online, they tend to be the smaller pure plays where someone there (likely CEO or CFO) came from an offline database marketing background, or large enterprises that are mostly web-based.  And of course, agencies that specialize in this kind of work.

    These corporate cultures exist. How did they get that way? Basically, they were “born” that way, or got there recently through web analytics. So the $64,000 question is this: if your company was not “born” that way, born with the “failure is a learning experience” mentality, how do you get there?

    So enters web analytics, the “model” as it were, the virus that could infect the rest of the corporation. The culture of failure as a learning experience is embedded in the web analytics model, at least to the extent of “best practices”, if not in most implementations. So let’s take a look at it: why did this happen?

    There are probably several reasons, the primary being “because they could”. In other words, they had the data, organized in some fashion, to be able to actually do something, to be able to engage in the continuous improvement process. It’s worth noting that after what seems to be an eternity, some companies are just beginning to collect this type of data in other areas of the business, for example, through CRM implementations.

    Another reason is low risk. As a percentage of the entire business, many of these web operations were quite small, and so rebuilding the site and testing carries fairly low risk to the corporation. “Nothing to lose”, as it were, to get out there, test, and learn. And clearly the success stories from these initial efforts drove others to give it a whirl. Since the process for optimizing a web site was made pretty clear by a lot of the early practitioners, and IT folks in general love a good process, you had a generally low overall failure rate. Step 1, Step 2, etc.

    A third reason would be that most sites were terrible in the first place, for reasons that are pretty well known (lack of clear Objective, IT-driven implementation, etc.). Lots of upside in simply doing something about it, don’t you know.

    So these are some of the primary reasons why the web analytics culture ended up with no fear of failure, or the failure-as-a-learning-experience concept embedded in it. There was no “culture” to begin with, and so the culture provided by the early practitioners – driven by the success of following the optimization process – was accepted and embedded. As part of this, everyone from Finance (ROI) to Customer Service to Fulfillment was part of the game, as the web site touches (or should touch) so many aspects of the corporation. These folks constitute the primordial cross-functional Business SWAT teams: buried in traditional corporations, but frequently already in place (like the ones I have been part of) in the more data-driven analytical business cultures. And there is no reason to believe this same concept couldn’t work on larger efforts than the web site alone. These are the folks that should be driving process improvement throughout the company.

    Following the model above, the challenges for the rest of the company will include:

    No data, or unclean data, or inaccessible data. This is really changing rapidly, finally. Data quality is one of the hottest enterprise topics right now.

    As to risk, unlike for web analytics, the risks when improving human-facing processes (dare I say Customer Experience?) for the corporation are probably quite high, at least operationally. You could create a pretty big mess if you’re not careful, witness CRM and the automation of worst practices. But there is a proven way to do it – slowly, incrementally, with a broad definitive roadmap but a “chunky” execution.

    As for current processes being “terrible”, well, I think companies are finding out more and more each day this is true. I don’t think a concept like “Customer Experience Management” could gain traction if this was not true. But as Ron pointed out, this idea is being hijacked into places it should not go and will die if (we?) don’t claim it and give it a framework, or come up with a different concept that is more closely aligned to what we are trying to do. The Business Process Management (BPM) folks are probably on the track I think is closest to what we’re talking about. In a way, BPM as a concept is web analytics for the corporation. But instead of being applied in a low risk, high reward environment, the BPM folks are facing just the opposite scenario.

    And, I’m not sure there are many Marketers involved in the BPM movement (yet?) so there is the same risk here you had with web sites – the engineers build it for them, as opposed to users. The question of what is the right outcome for an optimized process in anything customer-facing is a Marketing question, not an engineering question. The engineers, for example, are probably more sensitive to evaluating the outcomes based on short-term metrics (the process completed without error) as opposed to longer-term metrics (the value of the customer base increased). 

    For starters, every customer-facing interface ever built, from VRU / IVR trees to Customer Service scripts to ATM’s really needs to go through a “web analytics style” optimization process.  Seriously, think about it.  Self service checkout?  Can my mother use it?  And while we are at it, every sign, invoice, order fulfillment package, confirmation e-mail, and all the rest of the customer-facing stuff needs to be vetted by Marketing as well.  A model for how to approach this task, both online and off, in an integrated way, has already been put forth.  But say you believe in the Waiting for the Cat to Bark model.  When you try to bring it into the company, how are you going to implement when you are surrounded by people who live in a culture where everybody fears analysis and accountability?  Where will “the faith in analytics” come from?  Maybe from BPM. Or maybe CRM can be used to provide the success stories that drive BPM.

    No matter how this all shakes out, the fundamental problem is that metrics and analysis are coming to the corporation, yet no one is preparing the corporation for this, creating the proper culture. Just look at the failures of CRM in sales force management, for example, which are legendary. And why? Because the proper analytical culture was not in place first. Every time you hear the word “accountability” it’s really begging the question of analytical culture.  You are right when you say, “Who can go to his or her boss with a large smile and say ‘Boy! Did we miss the target on this one!’?”  But that is part of the culture question: the second business rule after “test everything” is when you test, test small, so your failures have a small financial impact.  This idea doesn’t occur to a lot of the “shoot the moon” folks in marketing, where the tendency is to go “all in” and making / admitting to a mistake is a real disaster.  The analytical culture includes a set of best practice rules to follow, but outside companies with an analytical culture, few know what those rules are.

    And that is the challenge – how do we make sure our people are ready for all this? How do we get Manic-ment to understand this problem is tangible and real? How can we prove that just about any initiative they have underway having to do with “accountability” or improving processes will have a much higher success rate if the proper culture is in place first?  What is the correct mixture of Change Management, Human Resources, and Training that will be required to ensure companies can succeed in pulling this transformation off?

    Like I said, that is the quest…

  3. If “analytics” (web or otherwise) is to succeed as a “virus” (bad connotation, Jim) in today’s organizations, it’s far more likely to happen as a part of Evolution than Revolution. The Consultainers love to hype the latest hot thing, cuz, well, that’s how they make their money.

    But disciplines and management improvements usually don’t take root as the result of big bang occurrences. They seep and ooze into organizations, until one day you wake up and realize they’ve become part of the fabric of the company.

    I believe analytics WILL succeed as the “virus” you describe, Jim. But — perhaps somewhat controversially — not as the result of many of today’s Web analysts or marketers. Instead, I think it will be the financial people — the CFOs and their organization that will pick up on the analytic virus and drive it throughout their firms.

    Why? They have the power, they have the credibility, and they have the quantitative background to understand this virus.

    Why not marketing and web analytics? Marketers are in the midst of a cold war — the branding gang versus the direct/quantitative gang. You can’t change the rest of the world when you’re in the middle of a civil war.

    p.s. I can’t believe you buried everything you said in a “comment” — it was very post-worthy.

  4. Ron, I am going to steal the “cold war” thing, it’s very apt…and funny.

    I’m on board with the CFO being the one to pull the trigger; after all, this “Productivity” discussion is all about the money. I just think it will go a lot more smoothly if the Marketing / Analytics folks get their ducks in a row and start the conversation with Finance, be a little aggressive about it. Otherwise, they may find themselves permanently banned from the boardroom, replaced there by IT and the Chief Customer Officer, and thus relegated to making pretty ads, buying media, and writing press releases – the most boring parts of Marketing.

    There’s some evidence (I think?) that those Marketing folks who step up to the Productivity plate are actually rewarded with bigger budgets. They don’t really come right out and say it in the article (and I wonder why not?), but the implication is there.

  5. Jim,
    I sincerely hope the Chief Customer Officer doesn’t replace the CMO or VP of Marketing in the board room.

    The really good Chief Customer Officers I’ve worked with and written extensively about serve a very different but complementary function. They need to know what the customer needs, wants, and is willing to pay for better than anyone else. Marketing needs to apply their strategic and tactical prowess to figuring out how to profit by delivering customized products and services that exactly meet these needs and wants.

    Marissa Peterson, former CCO of Sun Microsystems, owned much of the analytics as part of the Sun Sigma process. Jeff Lewis, former CCO of Monster.com, created a very small team to analyze customer data. These two CCOs recognized that their every success depended upon knowing their customers more solidly and completely than anyone else in the company.

    As you so aptly said above, research and analytics are here to stay.

    Curtis Bingham
    Author of the Annual CCO Report
    http://www.predictiveconsulting.com
    http://www.curtisbingham.com
