
Carl Bialik of the Wall Street Journal recently covered the business of web traffic measurement and how wildly the data gathered by services like Compete, Quantcast, comScore, Omniture, and Google Ad Planner can differ. I’d like to shed a little light on why measurements from these big-name firms diverge so much, and how you can still make sense of the world of online advertising even when the numbers are less than solid.

First, a little background. Measuring web traffic is a notoriously difficult task, so much so that no accepted standard of measurement exists. Instead, measurement firms compete for the business of publishers, each claiming to have more accurate data than the others. But as Bialik points out in his WSJ piece, the numbers they produce vary enormously.

To demonstrate just how large these differences can be, I plugged a few data points into Compete, Quantcast, and Ad Planner, services that make their traffic data readily available. Here are the US monthly uniques for NYTimes.com, WSJ.com, and Wired.com on each of the three services, rounded to the nearest hundred thousand visitors:

              Compete    Quantcast    Ad Planner
NYTimes.com   17.1M      12.6M        18.0M
WSJ.com       10.9M      4.8M         6.7M
Wired.com     5.5M       1.9M         2.9M

It’s plain to see that these numbers differ wildly, but it’s also crucial to notice that they don’t just differ in absolute terms, they differ in comparative terms as well. According to Compete, WSJ.com receives about 64% of the visitors that NYTimes.com does, while Quantcast’s measurements show WSJ.com receiving only 38% of NYTimes.com’s uniques. So it’s hard for advertisers to tell how large a publisher’s audience is, even relative to its competitors.
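
To make the comparison concrete, here’s a quick TypeScript sketch of that ratio arithmetic, using the Compete and Quantcast figures from the table above:

```typescript
// Ratio of WSJ.com's uniques to NYTimes.com's, per service. The
// figures are the US monthly uniques (in millions) from the table above.
const uniques: Record<string, { nyt: number; wsj: number }> = {
  Compete: { nyt: 17.1, wsj: 10.9 },
  Quantcast: { nyt: 12.6, wsj: 4.8 },
};

for (const [service, m] of Object.entries(uniques)) {
  const pct = ((m.wsj / m.nyt) * 100).toFixed(0);
  console.log(`${service}: WSJ.com draws ${pct}% of NYTimes.com's uniques`);
}
// Compete: WSJ.com draws 64% of NYTimes.com's uniques
// Quantcast: WSJ.com draws 38% of NYTimes.com's uniques
```

Same data points, very different pictures of the two sites’ relative reach.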

What causes these discrepancies? It all comes down to the nuts and bolts of how each service measures traffic. Some services rely on cookies, files saved on a user’s machine that the site later references to note repeat visits. But cookies are troublesome because users can delete them daily, weekly, or monthly depending on how they’ve configured their browsers, producing very unreliable data. If cookie deletion rates were consistent across audiences, this might not matter, but readers of Wired.com, a tech-savvy bunch, are likely more aware of their browsers’ cookie settings than the average user, skewing Wired.com’s traffic measures.
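
Here’s a minimal sketch, in TypeScript, of how cookie-based unique counting works in principle. The `visitor_id` cookie name and one-year lifetime are invented for illustration, not any particular service’s implementation:

```typescript
// Runs in the browser as part of a hypothetical measurement tag.
function getOrCreateVisitorId(): { id: string; isNewVisitor: boolean } {
  const match = document.cookie.match(/(?:^|;\s*)visitor_id=([^;]+)/);
  if (match) {
    // Cookie present: this browser has been seen before, so it is
    // counted as a repeat visit, not a new unique.
    return { id: match[1], isNewVisitor: false };
  }
  // No cookie: either a genuinely new visitor, or a returning visitor
  // who cleared cookies. The service can't tell the difference, which
  // is exactly how frequent cookie deletion inflates "unique" counts.
  const id = Math.random().toString(36).slice(2);
  document.cookie = `visitor_id=${id}; max-age=${60 * 60 * 24 * 365}; path=/`;
  return { id, isNewVisitor: true };
}
```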

Still other services use JavaScript, code that contacts the measurement service’s servers to log a pageview. But that JavaScript is often placed at the bottom of a page’s source code, so it runs only after the rest of the page has loaded in the browser. For users who are quick to move on to the next page, the script may never run at all. Tech-savvy users come into play here too: they may disable JavaScript entirely, making them impossible to count.
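
The timing problem is easy to see in a sketch of a bottom-of-page beacon. The `stats.example.com` endpoint is a placeholder and real measurement tags vary, but the principle is the same:

```typescript
// Fires a tracking-pixel request once the page has finished loading.
window.addEventListener("load", () => {
  // If the user has already clicked away by the time the load event
  // fires, this request never happens and the pageview is silently
  // missing from the measurement service's numbers.
  const beacon = new Image();
  beacon.src =
    `https://stats.example.com/pv` +
    `?url=${encodeURIComponent(location.href)}&t=${Date.now()}`;
});
```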

Server logs confuse the issue even further. Slate’s Paul Boutin wrote a great piece on this topic in 2006, which Bialik links to at the end of his WSJ piece. Boutin compares third-party measurements to Slate’s own server logs and explains the huge discrepancies between the two sets of numbers.
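
To see one reason raw logs disagree with third-party services, here’s a toy example of counting “uniques” straight from a server log by IP address. The log lines are invented, in a simplified Apache-style format:

```typescript
// Naive unique counting: one IP address = one visitor.
const logLines = [
  '203.0.113.7 - - [10/Mar/2009:10:00:01] "GET /article HTTP/1.1" 200',
  '203.0.113.7 - - [10/Mar/2009:10:05:42] "GET /other HTTP/1.1" 200',
  '198.51.100.2 - - [10/Mar/2009:10:06:13] "GET /article HTTP/1.1" 200',
];

const uniqueIps = new Set(logLines.map((line) => line.split(" ")[0]));
console.log(`pageviews: ${logLines.length}, uniques by IP: ${uniqueIps.size}`);
// => pageviews: 3, uniques by IP: 2. But an office full of readers behind
// one proxy counts as a single "visitor," while one reader whose ISP
// rotates IP addresses counts as several.
```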

All of this paints a pretty clear picture of an industry that can’t measure itself in any reliable way. But my reaction is a big “So what?” Even if a precise number were available to advertisers, they would still have to engage in trial and error to determine which sites provide the greatest return on investment. Some advertisers may see a larger ROI from sites with relatively small audiences, while others may get better results from the sites that consistently post bigger numbers. Either way, the notion that any measurement of audience size or demographics can accurately predict advertising effectiveness is ludicrous. The real measure of success comes from linking advertising dollars spent to sales, a problem that exists in every field of advertising.
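
A toy calculation makes the point; every figure here is invented:

```typescript
// ROI = (attributed sales - ad spend) / ad spend. The smaller
// audience can easily win on return.
const campaigns = [
  { site: "BigAudience.com", spend: 5000, sales: 6000 },
  { site: "SmallNiche.com", spend: 1000, sales: 2500 },
];

for (const c of campaigns) {
  const roi = ((c.sales - c.spend) / c.spend) * 100;
  console.log(`${c.site}: ${roi.toFixed(0)}% ROI`);
}
// BigAudience.com: 20% ROI
// SmallNiche.com: 150% ROI
```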

So, if you’re venturing into the world of paid advertising for your business or non-profit, use third-party measurements as a way to narrow down your options, not as the final word on where to put your advertising dollars. Unique landing pages or referral codes that track the sources of new visitors or sales will be a much stronger indicator of where to concentrate your cash than any number comScore or any other service can provide.
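
As one way to implement this, here’s a sketch of referral-code tracking on a landing page. The `ref` parameter name and the `sessionStorage` key are illustrative choices, not a standard; each ad would link to its own URL, e.g. `https://example.com/landing?ref=wired-0309`:

```typescript
// On the landing page: capture the referral code from the URL so a
// sale made later in the session can be attributed to the right ad buy.
const ref = new URLSearchParams(location.search).get("ref");
if (ref) {
  sessionStorage.setItem("ad_ref", ref);
}

// At checkout (or sign-up): read the code back and record it with the sale.
const source = sessionStorage.getItem("ad_ref") ?? "direct";
console.log(`attribute this conversion to: ${source}`);
```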

HT: Adam Thierer, my co-blogger at TechLiberation.com, for pointing out Bialik’s piece.