Posted May 2, 2007, 6:02 pm, with 9 comments


Are there parts of your site that you wish you didn’t have because they dilute your chances of getting a number 1 ranking on the search engines? Have you spent endless hours reworking your CSS so that your main content appears ahead of your navigation, in the hope of improving your search ranking?

Good news! Yahoo wants to help you filter out the content that doesn’t help your optimization efforts, with the introduction of the Robots-Nocontent tag.

…webmasters can now mark parts of a page with a ‘robots-nocontent‘ tag which will indicate to our crawler what parts of a page are unrelated to the main content and are only useful for visitors. We won’t use the terms contained in these special tagged sections as information for finding the page or for the abstract in the search results. Note: Using a “nocontent” tag to mark explicit sections of content is not considered “cloaking” because all of the content on the page is available to protect the relevance of the results (unlike “cloaking” where we may be served content that is different from what visitors see).
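Going by Yahoo’s description, the marker is applied as a value of the standard HTML `class` attribute rather than as a new element, so it can wrap any block you want the crawler to skip. A minimal sketch of what that might look like (the surrounding markup and ids here are hypothetical, not from Yahoo’s announcement):

```html
<!-- Main content: indexed normally -->
<div id="content">
  <p>The article text you actually want ranked and quoted
     in the search-result abstract goes here.</p>
</div>

<!-- Navigation wrapped in the robots-nocontent class:
     Yahoo says terms inside such sections won't be used
     for finding the page or for building its abstract -->
<div class="robots-nocontent">
  <ul id="nav">
    <li><a href="/archives">Archives</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</div>
```

Note that the content still renders for visitors and is still fetched by the crawler; the class is only a hint about which terms matter.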

Spammers? It’s yo’ birthday, yo’ gonna party like it’s yo birthday!

Is it just me, or is Yahoo just asking for trouble with this? If you strip out images, code, sidebars, etc., and let Yahoo index only the “good stuff,” isn’t that the same effect as cloaking?

And of course, the cynic in me is wondering what back-door use will be imposed on us in a few months. Remember how innocuous the introduction of “nofollow” was?

  • Whatever happened to the days of “act as if the search engines didn’t exist?” Can we get back to those please? Between nofollow, noodp, and now nocontent, it’s getting a bit ridiculous. I want a nocare tag.

  • I have to admit I didn’t think anyone could dream up anything more stupid and useless than “rel=’nofollow'”, but Yahoo! has obviously outdone Google in Stupid Search Indexing Tricks.

    How long will it be before Rand Fishkin is told at some conference he needs to start excluding certain portions of his on-page content or face penalties in the Yahoo! index?

  • Pingback: Robots-Nocontent Tag Introduced » SELaplana

  • I’m still a firm believer in search engines making better, smarter algorithms instead of annoying bloggers and webmasters with their new “rules to make their job easier”.

  • Nice. Now we can remove those posts or content where spammers have gone wild.

    Actually, SEO-wise, I do think it is just an option for blog owners and site owners.

  • I think the back-door targets will be those who sell on, or offer affiliate content with, no in-house payment processing. The message will be clear: use our tags or risk deselection. More FUD, here we come.

  • This is a cool method for filtering.

  • And people had better start expanding their under-the-hood checking for all those links they are out there buying, swapping, and begging.

  • Pingback: » Webspam2.0 und viel Erotik (-Bier) | seoFM - der erste deutsche PodCast für SEOs und Online-Marketer