Posted May 30, 2008 2:00 pm with 3 comments


By Michael Kraft

Versioning has long been a necessity in forming online marketing strategy and making sound business decisions. Versioning helps answer questions like:

  • What is the best price to charge for your products?
  • Which page layout is the best for conversions?
  • Which options should you present to your customers?

Most companies need to version their pages to find answers; the question then becomes, how do you version your pages in a way that doesn’t affect your organic search results, or better yet, improves them?

To answer this you must first evaluate how you are versioning your pages. Using a dynamic programming language like ASP or PHP, you can serve your users different versions of each page on your site in order to compare the effect of each tested page, price or feature. Cookies can be used to track which of your users have visited the site before, so that you can feed them the same version of the page to ensure a consistent site experience.
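The article names ASP and PHP; the same cookie-pinning logic can be sketched in Python (the variant names and cookie key here are illustrative, not from the original):

```python
import random

VERSIONS = ["a", "b", "c"]  # hypothetical page variants under test

def pick_version(cookies):
    """Return (version_to_serve, cookies_to_set).

    Returning visitors, who already carry a 'version' cookie, get the
    same variant again so their site experience stays consistent; new
    visitors are assigned one of the variants at random.
    """
    version = cookies.get("version")
    if version not in VERSIONS:
        version = random.choice(VERSIONS)
    return version, {"version": version}

# A new visitor is assigned a random variant; on the next request the
# cookie pins them to the same one.
first, set_cookies = pick_version({})
again, _ = pick_version(set_cookies)
assert first == again
```

Note that a crawler such as Googlebot, which typically does not return cookies, falls through to the random branch on every visit, which is exactly why it can cache a different version each time it crawls.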

Does Google Hate Versioning?

Once you are serving several page versions on your site, how does Google index them? Which version gets indexed, and how are the resulting pages ranked in the SERPs?

When Google crawls your site and reaches a versioned page, the bot is randomly served one of the versions to cache. Google then caches the version it was served, even though other versions of the page exist that Google never sees.

Wait… Isn’t That Cloaking?

Although similar, this process isn’t considered cloaking. Cloaking means deliberately serving users one thing and search engines another. With versioning, bots are served at random from the same pool of pages as users; Google even encourages multivariate testing with Google Website Optimizer.

Note: Purposefully serving search engine bots the same page simply because they are bots begins to look more like cloaking and less like legitimate multivariate testing. IP delivery has been a borderline issue with Google and one I have decided to stay away from.

Warning! Versioning Can Hurt Your Rankings!

When search engine spiders crawl your site, there is no way for them to account for the different versions being served. As the versioned page is indexed repeatedly over time and different versions are cached, the page appears to be constantly changing, and frequently changing content has implications for your rankings. For example, if one version has a keyword-targeted title tag while another simply says “Welcome to Billy’s,” there are likely to be substantial differences in ranking for that site’s competitive keywords. This WebmasterWorld thread has a lot of good information on the potential downfalls of dramatically changing content on a frequent basis or, in this case, having different versions of a page cached by Google on a frequent basis.

Will My Pages Drop In Rank?

The sites I work with often jump around in the rankings because of the versioning complications discussed above. When one of our top keywords dropped four positions, we started looking into why. At the time I chalked it up to the legendary Google Dance, but it was really due to versioning. In that case we were serving three different versions of one of our pages. As Google crawled our site over time, it cached whichever version it was served; and because the versions differed in some on-page SEO factors, we started to see movement in our rankings.

Google Help states that “the cached content is the content Google uses to judge whether [a] page is a relevant match for [a] query.” This means that when Google caches the ‘new content’ (actually just a different version), the rankings are adjusted accordingly, which makes clear the importance of having an SEO checklist in place when versioning your pages. Otherwise, as the different versions are inevitably cached, you could see major movements in your rankings.

Versioning Successfully

  • Keep title tags the same on each version of your page.
    • Having different titles increases the likelihood that your rankings will drop, as title tags are one of the most important factors search engines use to rank pages.
  • Keep content relatively similar.
    • Changing the theme or basic meaning of the page could have negative effects. Adding content, rather than changing it, may be a better strategy for keeping content fresh.
  • Use identical alt text on images.
    • Googlebot doesn’t just look at the surface appearance of your site and will notice any content differences on your pages; not paying attention to images could lead to a drop in rankings.
  • Use caution with rewrites and redirects.
    • If you are using rewrites and redirects, make sure they are done correctly and are not wasting link juice.
  • Be careful when making updates
    • If updates, changes or tweaks are needed for these on-page elements, make sure to change them in all your versions.
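The checklist above can be enforced mechanically before a test goes live. Here is a minimal sketch using only Python's standard library (the class and function names are my own, not from the article): it extracts the title tag and image alt text from each version's HTML and flags any mismatch, so differing on-page SEO elements are caught before Google caches them.

```python
from html.parser import HTMLParser

class SEOFields(HTMLParser):
    """Collect the <title> text and every img alt attribute from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.alts = []
        self._in_title = False
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img":
            self.alts.append(dict(attrs).get("alt", ""))
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data

def seo_consistent(*pages):
    """True if every page version shares the same title and alt text."""
    parsed = []
    for html in pages:
        p = SEOFields()
        p.feed(html)
        parsed.append((p.title, sorted(p.alts)))
    return all(fields == parsed[0] for fields in parsed)
```

Run `seo_consistent(version_a_html, version_b_html)` over your variants: versions that differ only in body copy or pricing pass, while a changed title or alt attribute fails the check.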

Following these tips will let you tweak your pricing, layout and options to drive conversions while mitigating the risk of losing your top SERP positions. Hopefully this will help you make changes without hurting your rankings.

This is an entry to Marketing Pilgrim’s 3rd Annual SEM Scholarship contest.

  • jonathan davis

    Good article! Is it sad that I also took away from this article that it’s good to version pages in order to improve conversion rates, and not just that it needs to be done carefully for SEO? 🙂

  • One thing to remember is that Google Website Optimizer, Vertster, and other smart testing tools use JavaScript to serve up different versions. Search engine bots DO NOT process JavaScript, and so they only recognize the ORIGINAL version. So, with a smart testing tool, the test will not mess up your SEO rankings. Only after you finalize the test and deploy your new conversion-optimized version will most search engines recognize the change in content. Taking this consideration into account, good article!

    Roy Furr’s last blog post: Instant Profit Boost with Google Website Optimizer

  • Michael Thompson

    If Google Optimizer is ok with serving search engine bots the original version repeatedly despite showing other users different versions, I wonder how they differentiate between websites doing versioning and websites that are cloaking?

    It seems like search engines would definitely not be ok with serving their bots an ‘SEO friendly’ version while serving users a completely different ‘conversion friendly’ version.

    Maybe there is some good faith given by SEs when they believe legitimate testing is going on?

    Very interesting, good comment, good article!