It’s an interesting read, especially when the publishers concede they’re getting a lot of traffic from Google, then turn around and use that as evidence that the company needs to be stopped.
Since the Belgian court decision went into effect and Google dropped IPM publications, traffic to the company’s sites has dropped about 15%, le Hodey concedes. Yet that only strengthens his sense that Google should be checked before it gets even more powerful.
BusinessWeek also gives us an explanation of what the European publishers are trying to create as their robots.txt alternative.
…a set of sophisticated software “tags” readable by search engines’ Web crawlers that would automatically tell aggregators under what terms they can use editorial content.
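For context, the existing robots.txt protocol that these “sophisticated tags” would replace only supports coarse allow/deny rules per crawler. A minimal sketch of the difference (the first block is standard robots.txt syntax; the extended usage-terms directives in the second block are purely illustrative, not any actual proposed syntax):

```
# Today's robots.txt: binary access control only
User-agent: Googlebot
Disallow: /archives/
Allow: /

# Hypothetical "terms" extension of the kind the publishers describe:
# directives telling an aggregator how content may be used, not just
# whether it may be crawled (directive names invented for illustration)
User-agent: Googlebot
Usage-snippet: max-length=100
Usage-cache: prohibited
Usage-expires: 30d
```

The catch, of course, is that nothing in a tag compels an aggregator to honor it; robots.txt already works only because crawlers voluntarily comply.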
Why doesn’t Google just remove every European publisher’s content from its index? Then the publishers can spend their time creating a tag that begs for mercy.