I attended the Web Analytics Association’s symposium in Boston not long ago. For a good overview of the event, see Dean Westervelt’s recap on the Metrics Marketing blog. My motivation for attending was two-fold: 1) Picking up some nuggets to add to the Online Marketing Maturity model that I’ve developed, and 2) Seeing Tom Davenport speak.
Tom is the author of Competing on Analytics and a leading light in a number of management innovations over the past 20 years. I first heard of Tom in the early 90s when he wrote a Harvard Business Review article about business process redesign. He later wrote about knowledge management, and now champions the use of analytics for management decision-making. When Tom writes, I read. I can’t say that about a lot of folks in the business world.
The gist of Tom’s presentation is that we’re in a “new quantitative” era. Businesses need new decision approaches, approaches that will be driven by enterprise analytics (which encompasses web analytics, marketing analytics, supply chain/OR analytics, HR analytics, predictive analytics, etc.).
Tom told the web analytics folks that they need new skills. They need to fix a problem — not just identify it. They need to tell a story with data, help frame decisions, and stand firm when necessary. According to Tom:
“Analytics without plans for decision-making is a waste of time.”
I couldn’t agree more.
So it struck me as a bit off-message when Tom displayed a graphic (shown below) that purported to show analytical intelligence and maturity (other citations I’ve seen of this model label the X axis as “degree of intelligence”).

My take: This model isn’t correct.
A little context on why: I often talk about a “new” competency that marketers need: A sense-and-respond competency.
Specifically, marketers need to sense where customers and prospects are in the buying cycle and to respond with the most appropriate message.
The problem with my idea is that it’s not a new competency. It’s what marketing has always done: tried to sense what consumer needs and wants are, and to respond with messages, offers, and actions.
What is different about marketing in the 2000s versus 30 or 40 years ago is that the amount of information available to fuel sensing activities has proliferated. Instead of relying on demographic and (internal) purchase data, the amount of data available to marketers has become overwhelming.
The respond side of the model is also very different. Not only are there many more channels or touchpoints through which a marketer can respond, but those messages, offers, and actions can be delivered much more rapidly than in the past.
So why is Davenport’s model wrong?
Because it ignores the respond side of the sense-and-respond construct. Simply developing optimization or predictive models does not make you a mature marketer if you haven’t figured out how to take the output of those models and respond effectively and in a timely manner.
Ironically, based on the comment I cited above, Tom knows this very well.
Tom also commented, when presenting this model, that web analytics was mostly focused on the bottom half of the chart. I joke-tweeted at that point that Tom had pretty much ensured that he wouldn’t get invited back to next year’s symposium. Another attendee responded that I was wrong, and that web analytics agreed with Tom, and they were “getting there” in terms of moving up the model.
That’s all well and fine. But it might be the absolutely wrong thing to do.
Analytical maturity isn’t a function of how “intelligent” your analytical approaches and models are. It’s a function of:
1. Alignment. Specifically, the alignment of analytical approach with the type of decision that needs to be made or the issue or problem that the firm is facing. Not every business problem requires an optimization or predictive model. Would you want to take a less-than-intelligent approach to solving a business problem? Of course not. That’s why drill-downs and alerts are not less intelligent than a predictive model. It all depends on what problem is being addressed.
2. Sense-and-respond ability. Queries, drill downs, and alerts might not be high on Davenport’s intelligence scale. But they can often be accomplished more quickly than a firm can develop, test, and implement a predictive or optimization model. Often, it’s the fastest response that wins, not the most elegant.
When I first thought about the sense-and-respond construct, I thought there was a third component: Assessment. I thought “first we sense, then we respond, then we evaluate or assess our actions.” But upon further thought, I realized that wasn’t quite right. Assessment/evaluation is just another form of sensing. It’s a continuous loop: we sense, respond, sense, respond, etc. So how mature your analytics capability is depends not just on how timely your responses are, but on how well you assess the effectiveness of your analytics, and how well you can recalibrate and adjust your models and approaches.
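To make the loop concrete, here’s a minimal sketch of the sense-and-respond cycle described above. The function names, buying-cycle stages, and offers are all hypothetical illustrations of my own, not anything from Tom’s presentation; the point is only that each response generates new signals that feed the next sensing pass.

```python
def sense(signals):
    """Classify where a prospect is in the buying cycle from available signals.

    The signal names and stages here are invented for illustration.
    """
    if "requested_quote" in signals:
        return "decision"
    if "compared_products" in signals:
        return "evaluation"
    return "awareness"

def respond(stage):
    """Pick the most appropriate message for the sensed stage."""
    offers = {
        "awareness": "educational content",
        "evaluation": "product comparison",
        "decision": "pricing offer",
    }
    return offers[stage]

# Assessment is just another form of sensing: the prospect's reaction to each
# response becomes a new signal on the next pass through the loop.
signals = ["compared_products"]
stage = sense(signals)
print(respond(stage))
```

Note that nothing in the loop requires a predictive model; a rule-based drill-down or alert can drive the same cycle, which is the alignment point made above.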
3. Culture. I know of some large financial services firms with teams of statisticians who develop (and implement) predictive and optimization models, who look down on web analytics efforts as somehow being inferior to their highly statistical efforts. An analytically mature organization doesn’t have this problem.
4. Data usability. In the list of factors determining analytical maturity, this is certainly not the last one that should be mentioned, but the immature use of data often stems from a cultural immaturity. As I mentioned above, plenty of firms have developed predictive and optimization models to drive their marketing efforts. But many still use a relatively narrow set of data to power those models. Specifically, they often do not incorporate web-based data. An analytically mature marketing department uses a range of data sources, and gathers data for its analytical efforts more effectively and efficiently than a less-mature firm.
Bottom line: I really can’t speak to analytics efforts outside of the financial services world, but among financial institutions, there is an analytics gap. The gap is that highly sophisticated statistical approaches to analytics are being undertaken, but often in a very untimely manner and with a limited set of data. On the other hand, while web analytics efforts are often more timely, and incorporate a more efficiently gathered set of data, these efforts haven’t grown beyond relatively simplistic analytic approaches.
I don’t think the web analytics folks can bridge this gap by themselves.
When my fellow Symposium attendee tweeted that web analysts were “getting there” in moving up the (so-called) intelligence scale, I couldn’t help but wonder: 1) How is that going to happen? and 2) Why hasn’t it happened before?
Here’s the problem, as I see it: Unless you’re a web analyst with a good understanding of statistical approaches AND a good understanding of business problems, decision making, marketing, and organizational politics, then I’m not sure you’re well equipped to move web analytics up the maturity scale.
I’m not saying there aren’t web analysts out there who meet these criteria, but looking at this from the other end of the spectrum, I can’t say I’ve met a lot of statisticians with a good understanding of web analytics AND a good understanding of business problems, etc.
So who’s going to bridge the gap?
My bet (hope?) is on the vendor community. When I look at what IBM is doing with its acquisitions of Unica, SPSS, and Coremetrics, or what Adobe is doing with its acquisition of Omniture (which had previously acquired Offermatica and Touch Clarity), I see the pieces coming together. The part of the maturity equation that I’m not sure the vendor community can address is the cultural component.
Whatever happens, it’s an interesting time to be involved in marketing analytics.