The Accuracy of Web Analytics


When it comes to marketing, everyone is out to compose the most compelling message possible and then measure an audience’s reaction to that message. The question then becomes: are the measurement tools that web marketers use reliable? I think the general consensus is that many of the tools that rely on JavaScript are less reliable than marketers would like them to be.

Stone Temple Consulting has performed a data quality test showing that where one locates the JavaScript that our analytics tools rely on matters. The test showed two things: one, that page load time matters, and two, that the amount of traffic to a server can adversely affect page load time. They had to take both of these factors into account when performing their test.

Their test, for all intents and purposes, showed the effects of having the JavaScript located at the top of a web page versus the same script located at the bottom of the page. Their conclusions seem logical, but they are well worth reviewing, especially in relation to your own website’s performance.
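
If you are not sure where your own pages fall, a rough check is easy to do by hand. The snippet below is a minimal sketch, not the methodology Stone Temple used: record a timestamp as early as possible in the head, then measure the difference once the load event fires.

    <!-- Minimal hand-rolled page load timer; a rough illustration only. -->
    <script>
      // As early as possible in the <head>: note when parsing started.
      var pageStart = new Date().getTime();
      window.onload = function () {
        // Fires once the page, including images and scripts, has loaded.
        var loadTimeMs = new Date().getTime() - pageStart;
        alert('Approximate page load time: ' + (loadTimeMs / 1000) + ' seconds');
      };
    </script>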

Our suggestion would be that if your page load is normally pretty snappy (total page load time of under 3 or 4 seconds), then keep your JavaScript at the bottom and remove any risk related to analytics vendor downtime. The small loss of data you see in this scenario should not be a significant factor in the value of your analytics data.

But if your page load time is a bit slower (4 seconds or more), you may want to consider placing your analytics JavaScript at the top of your web page. With a slower page, the data lost by a bottom-placed script will be larger, and the nature of the lost data may start to differ.
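
For anyone unsure what top versus bottom placement actually looks like in markup, here is a bare-bones sketch. The script URL is a placeholder, not any particular vendor’s real tag:

    <html>
      <head>
        <title>Example page</title>
        <!-- Option A: tag in the <head>. It runs before the page renders,
             so early abandoners are still counted, but a slow or unreachable
             analytics server can hold up the whole page. -->
        <!-- <script src="http://analytics.example.com/tracker.js"></script> -->
      </head>
      <body>
        <p>Page content...</p>
        <!-- Option B: tag just before </body>. The page renders first and
             vendor downtime can't block it, but visitors who leave before
             the page finishes loading are never recorded. -->
        <script src="http://analytics.example.com/tracker.js"></script>
      </body>
    </html>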

Tests like these remind marketers why it is important to consider running two different analytics solutions in parallel. Examine the statistical differences in the same data over a period of days, weeks, and months, and then watch to see whether the reporting trends hold. Actually attempting to reconcile the data is a nightmare and probably something you don’t want to do, because the numbers are never going to match 100%; there are still some basic flaws in any form of analytics acquisition on the web.
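
If you do run two solutions side by side, the trend check itself can stay simple. The sketch below uses made-up daily pageview counts; the point is to watch whether the gap between the two tools stays roughly constant, not to force the numbers to match.

    // Hypothetical daily pageview counts exported from two analytics tools.
    var toolA = [1520, 1480, 1610, 1395, 1702];
    var toolB = [1401, 1355, 1490, 1275, 1588];

    // Day-by-day percentage gap. A stable gap suggests both tools are
    // tracking the same trend; a drifting gap is what deserves a closer look.
    for (var i = 0; i < toolA.length; i++) {
      var gap = 100 * (toolA[i] - toolB[i]) / toolA[i];
      console.log('Day ' + (i + 1) + ': tools differ by ' + gap.toFixed(1) + '%');
    }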

So my question for the readers of this blog is: how much do you trust the analytics data you are reviewing on your own sites? Does this type of test make you want to reconsider your site’s load times as well as the position of the JavaScript on your pages?

  • http://coolslko.blogspot.com Neeraj Srivastava

    Really, load time is so important for a website, and we always try to keep it under 3-4 seconds, but I wasn’t aware that we should place the analytics JavaScript according to load time… Really great tips and helpful too… thanks for sharing

    Neeraj Srivastava’s last blog post..Useful Tips to Counter Web Spam-Matt Cutts

  • http://forums.webexcelsolutions.com/ SEO Forums

    That’s indeed a cool tip for inserting analytics code… really, I’m thankful to you…

  • http://www.newhomessection.com Jayson

    Thanks for the information – we’ve never fully trusted our analytics; we usually just accept that they’re wrong but hope that they are consistently wrong in the same areas.

    We track traffic, etc., using 3 different programs, and they’re all significantly different.

    Yeah, I’m going to check on our load times and code location and see if we can improve in some areas.

  • http://www.webdesignmedia.co.uk Ralph

    We also put the analytics code right at the bottom of the page. Great tips for analytics.

  • http://www.exposedseo.com seo dude

    And then there are always browser compatibility problems. If the browser does not support JavaScript or has it disabled, then that person will never get picked up by analytics. This is why you often get different results from analytics compared to your server-based stats.

    seo dude’s last blog post..A few quick updates

  • Pingback: Links do Dia: 06.05.08 « Dissonância Cognitiva

  • http://www.searchengineoptimizationjournal.com Search Engine Optimization Journal

    Great post! There are many great analytics programs out there, but they often show vastly different data due to different technologies.

    Search Engine Optimization Journal’s last blog post..How Social Media Can Help Boost Your Search Engine Optimization Strategies

  • http://seologia.com.ar Seologia

    What about placing the code in the middle or bottomish-middle?

    Seologia’s last blog post..El algoritmo de Google no es perfecto

  • http://www.timeatlas.com Anne H

    Nice reference. If people are looking for more ways to speed up their site, I would highly recommend the YSlow plugin for Firebug.

    The tool will give you an overall score and a subtotal for different areas. One suggestion does concern JavaScript placement. It gives some great advice on speed rules and links to some work done by Yahoo!’s performance team.

    The tool only works with Firefox. If you use IE, you can download AOL’s pagetest.

    I’ll warn you that if you don’t use a content delivery network (CDN), your score will be lower. I wish it had an option to turn that rule off. Otherwise, I think the tool is great.

  • http://prosperitywriter.com/ Prosperity Writer

    I tested my page load time with a friend from the US. It was a bit slow, so I will move my analytics code.

  • Brian

    Intensive purposes?

  • http://www.vizioninteractive.com/management-team/ Mark Barrera

    Whatever happened to log file analysis?

    It seems that if you want complete data, just analyze the logs…

  • Jordan McCollum

    @Brian—I’m guessing that should be “intents and.” Thanks for the heads up.

  • Pingback: Paths Forward (05.06.08) — friday night running

  • http://www.webdesignmedia.co.uk Ralph

    What would be the best analytics tool to use to make things faster? Google Analytics is very slow…