Posted December 20, 2006, 10:13 am, with 7 comments


Rand pointed to Adam Lasnik’s – yep, Dr. Google himself – guide to avoiding duplicate content filters at Google. I know some of you are thinking at this point, “duplicate content is Google’s problem, not mine,” but Adam has some great advice.

One suggestion that caught my eye…

If you syndicate your content on other sites, make sure they include a link back to the original article on each syndicated article. Even with that, note that we’ll always show the (unblocked) version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer.

Marketing Pilgrim’s content is syndicated in a few places, most notably Web Pro News. I’m often asked whether allowing other sites to syndicate blog posts triggers duplicate content filters. I’ve never experienced it – MP’s content is typically favored in Google over the WPN version – and now I know why. Whenever someone asks if they can syndicate my content, I always insist on a link back to the post or site.
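In practice, the attribution Adam describes is just a plain, crawlable anchor link on each syndicated copy pointing back to the original. A minimal sketch of what that might look like (the URL and wording here are purely illustrative, not from the original post):

```html
<!-- Attribution placed at the top or bottom of the syndicated copy.
     The href should point to the canonical original, not the syndicating site. -->
<p>
  This article was originally published at
  <a href="http://www.example.com/original-post/">Example.com</a>.
</p>
```

The key design point is that it is a regular HTML link search engine crawlers can follow, giving Google a signal about which version is the original.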

  • Dupe content filters are too strict at Google.
    I think Google is going too far by forcing people to please them with unique content on one static URL that has to have a very unique title.

    Thank you for sharing this story with me!

  • While I haven’t experienced this particular issue, I have noticed that since switching to a new URL setting (where the URL names the pages instead of just providing a numbered URL) my pages show up twice in Google. Of course I set up redirects from the old pages to new pages, so I guess that’s why. I wonder if there’s a way to avoid this.

  • It’s reassuring to know that provided the link back is there you’ll be okay, though I have seen examples where a high authority site syndicates a low authority site’s article and appears much higher up the rankings. That’s not hugely surprising in the examples I’m thinking of, though.

  • I have spent many hours rewriting my links to be unique for Google. I have to agree with Lola…the filters seem to be somewhat too strict. It seems like every time I have created unique links I am hit with more duplicate ones.

  • I have seen exactly identical content in more than two places. Does it matter? If the original content is edited with new words or paragraphs, or with additions or deletions, would it still be treated as duplicate? After all, identical subjects may naturally have similar content. How does Google audit such content, and at what rate?

  • Pingback: Tips for successful affiliate marketing « Lilacor’s SEO Blog

  • Thanks, your article is very helpful. I’ve noticed plenty of members of PLR organizations simply posting up articles without realizing that this could cause duplication issues. Keep up the good work!