Posted September 18, 2009, 1:34 pm, with 8 comments


Two years ago, Yahoo heard requests from SMX Advanced attendees and added a Dynamic URL Rewriting feature to Yahoo Site Explorer, allowing webmasters to remove parameters from URLs, such as session IDs, that might create duplicate content issues or otherwise confuse search engines. Now Google’s finally catching up. As Exoogler Vanessa Fox reports at Search Engine Land, Google has added a feature to its Webmaster Tools that strips parameters from URLs.

With extra parameters, search engines can index the same content under multiple URLs, which may split link equity. Vanessa also notes that indexing parameter-laden URLs can hurt crawl efficiency, as well as how your URLs display in search results and reflect your branding.

Seven months ago, Google, Yahoo, and Microsoft (at the time) signed on to create a canonical URL tag, which indicates which URL is the “correct,” parameter-free version of a page. In her article, Vanessa compares these and several other methods of keeping parameters from interfering with search engine performance, and each has its drawbacks. The new Webmaster Tools option, for example, might accidentally close off portions of your site to search engines, and it only works for Google.

On the other hand, although the canonical URL tag requires a page to be crawled before it can be indexed, and might thus cost a little in initial crawl efficiency, in most cases it will probably be the easiest solution to implement and the most universal. Like the Webmaster Tools option, it can be done incorrectly: Vanessa says some sites accidentally indicate that the canonical URL for every page on the site is the home page. Currently, she says, Google is the only search engine actively using the tag anyway, so at present the universal benefit is moot.
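In practice, the canonical tag is a single line in each page’s `<head>`. A minimal sketch (the `example.com` URLs and the `sessionid` parameter are illustrative, not from any particular site):

```html
<!-- On every parameterized variant of the page, e.g.
     http://www.example.com/product?sessionid=abc123
     point search engines at the clean, parameter-free URL: -->
<link rel="canonical" href="http://www.example.com/product" />
```

The mistake Vanessa describes happens when a site template hard-codes one `href` (often the home page) into every page instead of emitting each page’s own clean URL.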

What do you think? Will you use Webmaster Tools to strip parameters from your URL, the canonical tag, 301 redirects, or another method? Or do you see little harm from extra parameters in your URL?
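For those leaning toward 301 redirects, here is one common way to strip a session parameter at the server with Apache mod_rewrite. This is a hedged sketch, not a drop-in rule: `sessionid` is a hypothetical parameter name, and this version simply drops the entire query string, which is only safe if the session ID is the sole parameter your URLs carry.

```apache
RewriteEngine On
# If the query string contains a sessionid parameter...
RewriteCond %{QUERY_STRING} (^|&)sessionid=
# ...301-redirect to the same path with no query string
# (the trailing "?" clears it; adjust if other parameters must survive).
RewriteRule ^ %{REQUEST_URI}? [R=301,L]
```

Unlike the canonical tag or the Webmaster Tools setting, a 301 works for every search engine and for visitors’ bookmarks alike.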

  • I would prefer using the canonical tag. I like the idea of knowing that I can only screw up one page at a time 😉 , and of course ultimately this will be the most universal.


I implemented Google’s Webmaster Tools suggestions just yesterday; it seems a good move, since anything that makes spidering and indexing the site easier helps. I’ll see the real consequences over the coming week, and you can always reset.

As for the canonical tag, I just implement 301s instead, as there are enough meta tags in the header already.

    Thanks for the article, as always well up to date.

  • Pingback: How to Create Unique Content from Scratch | The Logicfish Blog

  • This is a nice blog full of informative stuff. My comment would only be that I’ve been playing around with the URL parameters Google has to offer a lot lately, mostly after the de-personalized search stuff.

  • This is a nice new feature from Google. While it’s helpful for its additional insights into how Google sees your website, it certainly doesn’t provide the best method of reducing, let alone completely eliminating, duplicate content issues, unless perhaps you’re an SME. From Google’s perspective, getting sites with duplicate content to at least reduce it will instantly free up their computing and crawler resources.

  • Pingback: Bloggers Digest Resurrected: Ecommerce Links for September 2009 | Small Business Software | News, Reviews and Resources!

  • If search engine bots crawl the same page via multiple URLs, they may not have the resources to crawl as many unique pages on the site. There’s also PageRank dilution, which can lead to lowered search rankings: if external sites link to multiple versions of a page, each version accrues less PageRank.
