Can We Have Some Metrics On How Effective Metrics Are?
from the seduced-by-numbers dept
There’s just something about “metrics.” I spent Monday at the Word of Mouth Marketing Association’s Research Symposium, which was supposed to be all about learning how to track the impact of word of mouth marketing efforts. Of course, that makes an implicit assumption: that you can, need to, or want to track those things in a simple manner. The key point made by many of the attendees (mainly from big companies) seemed to be that they need to figure out how to translate “WoM” campaigns back into the common terminology and “metrics” of traditional marketing, in order to convince CEOs and marketing VPs that word of mouth campaigns make sense. Metrics was the word of the day, with multiple people saying that without metrics, word of mouth marketing was effectively pointless. There’s just one problem: metrics can often be worse than useless. Metrics are only as good as what they measure, and they’re often measuring either the wrong thing or something that can easily be gamed. A month ago, Joel Spolsky had a fantastic post on the ridiculousness of metrics, and how they’re often used mainly by consulting firms to convince big companies to funnel money into the consultants’ bank accounts.
With all of that in mind, it’s not that surprising to see a story noting that web analytics firms (who have been at this for quite some time, and who you would have thought had worked out the kinks) are revising certain sites’ traffic numbers downward, often by tremendous amounts. Apparently, Entrepreneur.com didn’t have 7.6 million visitors, as reported, but just over 2 million. That’s not a small error. The problem, it seems, is that the site’s news content was placed into unwanted pop-up ads, often delivered by adware on lots of computers. The pageview may have happened, but it wasn’t wanted, and it’s quite likely the window was closed (perhaps with anger) before it even loaded. In other words, it was a metric that was gamed. Keep this in mind when you hear silly reports like MySpace passing Yahoo in pageviews, according to various metrics companies. As the article linked here notes, part of that may be due to Yahoo shifting its web mail app to AJAX, making it much more efficient and user friendly while requiring fewer page views. Equally responsible, however, is MySpace’s awful design, which requires extra page impressions to actually get anything done. By focusing so much on the “page views” metric, the incentives encourage bad design. The companies that win the metrics game are the ones designed in ways that frustrate users, rather than the ones that make their service better, faster, and more efficient. Normally, that would be a clear indication that the metrics are all wrong, but very few people seem to care. They just keep focusing on who is ahead of whom on the list, and plenty of bad decisions continue to get made because of this blind allegiance to metrics.