Publishers to Spend Half Million Dollars on a Robots.txt File

Good grief, Reuters is reporting that a group of publishers is planning to spend $583,700 to find a way to prevent Google from indexing their content!

“Since search engine operators rely on robotic ‘spiders’ to manage their automated processes, publishers’ Web sites need to start speaking a language which the operators can teach their robots to understand,” according to a document seen by Reuters that outlines the publishers’ plans.

What the..?

Google already offers a way for publishers to have their content removed. Better yet, modify your robots.txt file to disallow the search engines, and that money can go to a charity that helps paranoid publishers! ;-)
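
For the record, here is a minimal sketch of the kind of robots.txt file that would do the job. Nothing below is taken from the publishers' actual plans; it's just the standard exclusion directives that every major crawler already honors:

    # Lines beginning with "#" are comments; crawlers ignore them.
    # The two directives below tell every compliant spider to stay out of the whole site.
    User-agent: *
    Disallow: /

Save it as robots.txt at the root of the site, or swap the asterisk for Googlebot if only Google needs shutting out. Either way, it's one small text file, not a half-million-dollar research project.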

  • http://www.sierrawebmarketers.blogspot.com Mike

    Wow, someone talked them into that deal?
    If people want to throw away money, I’ll take some.

  • http://www.seoposition.com Brian Gilley

    I could’ve hooked them up for about $20 since it only takes a whopping 5 seconds to do.

    That’s ridiculous!

  • http://ethosplanning.com Paul Drago

    I guess the World Newspaper Association can’t even ask their own web guys, since they haven’t bothered with a robots.txt file either.
    Yet, amazingly, our local paper has one. The New York Times has one.
    Hell, even the Clarion Ledger in Mississippi has one! (though “# be nice.” isn’t a directive I’m familiar with :) )

  • Pingback: I’m Going to Shoot Myself in the Foot | inter:digital strategies

  • http://www.tupac-amaru.com Abstroose

    $500,000?! Wow, I can’t see what you could possibly want in a robots.txt file that’s worth that much.