Publishers to Spend Half Million Dollars on a Robots.txt File
Good grief, Reuters is reporting that a group of publishers is planning to spend $583,700 to find a way to prevent Google from indexing their content!
“Since search engine operators rely on robotic ‘spiders’ to manage their automated processes, publishers’ Web sites need to start speaking a language which the operators can teach their robots to understand,” according to a document seen by Reuters that outlines the publishers’ plans.
Google already offers publishers a way to have their content removed. Better yet, a quick edit to your robots.txt file will disallow the search engines, and the money saved can go to a charity that helps paranoid publishers! 😉
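For the record, the half-million-dollar problem is solved by a two-line file served from the site root at `/robots.txt`. A minimal sketch, using the standard Robots Exclusion Protocol directives (the wildcard `*` covers all compliant crawlers; substitute `Googlebot` to target only Google's spider):

```
# Block all well-behaved crawlers from the entire site
User-agent: *
Disallow: /
```

Googlebot honors these directives, so any content disallowed here simply stops being crawled, at no cost.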