

Does Google Algo Update Really Target Entire Content Farms?



Sometimes a farm can just have a bad year. One field doesn’t produce like it has in the past, so the quality of one crop suffers while the rest of the farm does fine. Google’s latest update (which Search Engine Land’s Danny Sullivan is trying to call the “Farmer” update, but I am going with the “Discontent” update) is really telling everyone that generally low-quality content, no matter where it comes from, is going to get whacked.

Google’s official blog says:

Our goal is simple: to give people the most relevant answers to their queries as quickly as possible. This requires constant tuning of our algorithms, as new content—both good and bad—comes online all the time.

Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

The “Personal Blocklist for Chrome” extension has played no part in this, according to Google, and the impact will only be felt in the US for now (which is where most of the complaining was probably coming from anyway, because that’s what we do, like it or not).

Over at Search Engine Land, Sullivan points out how Richard Rosenblatt, CEO of prime content farm target Demand Media, got very defensive about Demand getting the ‘stupid content farm’ label. If Mr. Rosenblatt is so upset by the implication, my suggestion is to simply check the quality of your content before you say anything more. The example given at SEL, a ranked ‘article’ from Demand’s eHow site about how to get pregnant fast, is both pathetic and comical. Check it out; you’ll chuckle.

The SEL post goes on to say:

Rosenblatt is right that Demand Media properties like eHow are not necessarily content farms, because they do have some deep and high quality content. However, they clearly also have some shallow and low quality content.

That content is what the algorithm change is going after. Google wouldn’t confirm it was targeting content farms, but (Matt) Cutts did say again it was going after shallow and low quality content. And since content farms do produce plenty of that — along with good quality content — they’re being targeted here. If they have lots of good content, and that good content is responsible for the majority of their traffic and revenues, they’ll be fine. If not, they should be worried.

This is why I think this update is really trying to do what Google has claimed it does all along but has yet to deliver on: simply weeding out the bad content from the good. Google likes to see themed sites and thus supposedly rewards those sites overall for a consistent message within a certain silo of information. What they have done, though, is take eHow articles and other content of questionable quality and rewarded it, even though the only theme in sites like that is silos of information occurring in one spot of the overall site (like About.com). These ‘catch all’ content sites aren’t themed on a particular area, thus they have never fit the true Google site profile for quality. Despite that they get rankings, or at least they did.

So will Google now be able to go into a site and find only the good quality content while ignoring the bad? What will be the percentage balance that tips the scales for Google to deem a site junk despite a mix of good and bad content? What will the collateral damage be if they have chosen specific targets but will just let the algorithm make the decision? That sounds a bit against their usual way of doing things and, while I couldn’t tweak an algorithm if my life depended on it, I would have to think it would get pretty difficult to adjust full-site ranking signals to identify individual pieces of quality content within a site and then reward just those quality entries. Will it be a crop that gets targeted or will entire farms get shut out? With just about 12% of results being affected, that still seems pretty low considering how awful some SERPs can be when it comes to quality content returns.
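To make that question concrete, here is a minimal, purely hypothetical sketch of the trade-off: per-page quality scores rolled up into a site-level verdict, with a tunable threshold deciding when the bad pages drag the whole farm down. The scores, threshold values, and function are illustrative assumptions on my part, not anything Google has disclosed.

```python
# Hypothetical sketch only -- not Google's algorithm or its actual signals.
# Each page gets a quality score in [0, 1]; the open question is how a
# site-level judgment forms from a mix of good and bad pages.

def site_verdict(page_scores, quality_floor=0.4, junk_threshold=0.5):
    """Return a site-level call from per-page quality scores.

    quality_floor: a page scoring below this counts as low quality.
    junk_threshold: if more than this fraction of pages are low quality,
    the whole site is demoted rather than just its weak pages.
    """
    low_quality = [s for s in page_scores if s < quality_floor]
    junk_ratio = len(low_quality) / len(page_scores)
    if junk_ratio > junk_threshold:
        return "demote the entire site"
    return "demote only the low-quality pages"

# A "farm" that is mostly thin pages vs. a site with one bad field:
print(site_verdict([0.2, 0.3, 0.35, 0.8, 0.25]))  # demote the entire site
print(site_verdict([0.9, 0.85, 0.7, 0.3, 0.95]))  # demote only the low-quality pages
```

Where exactly that junk_threshold sits is, in effect, the “percentage balance” the paragraph above asks about.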

In the end, though, maybe this is a moot point anyway. Any SEO worth their salt will tell you that while great content is necessary, it is mostly necessary for the reader. SEOs know that if they were given two sites, one with great content but only the ability to attract links organically, and the other with content that barely meets English-language standards but a boatload of links pointing to it from an SEO’s efforts, the link-heavy craptent will win almost every time.

This might be the real problem that Google has. Maybe their heavy dependence on links as a signal of quality is simply being gamed. SEOs have found what works to rank a site regardless of the quality of the content, and it will be interesting to see if this update truly impacts that on a Google-wide basis. Sure, the engine has come down on JC Penney and Overstock, but it took outside recognition of the problems and then some pressure before anything changed. The algorithm didn’t ID Overstock’s overindulgence in .edu links; an Overstock competitor did. JC Penney didn’t get whacked until the New York Times wrote about it.

Maybe the market is ready for an SEOleaks site where someone reveals the dirty little SEO secrets of sites that have garnered success in Google for all the wrong reasons? Maybe Google needs to hire a spy network of SEOs to do its bidding and get to the real root of the problem rather than depending on Uncle Algo to do it all? Can you imagine two SEO sleuths ‘interviewing’ a site for information and playing “Good Crop, Bad Crop”? Now that would be interesting.

I have said this on numerous occasions (and I realize I am not unique in this), but I think Google’s biggest threat to continued success is their dependence on technology to do everything and their minimizing of human interaction and human work to tidy up the edges. As with any piece of fabric, if the edges are frayed, eventually the whole fabric will fall apart. Could that be Google’s fate? Is this whole algorithm mystique more show than go? Look at business history. Every deal (business) on the planet is 90 days away from extinction, I have been told. Why should Google be any different?

Since hiring bodies would plow into profits, will Google ever consider more human fire power that is not pointed toward building the perfect technology but rather dealing with the real world results of that technology? They should, in my opinion, but I suspect they won’t because, despite all of the good things Google does, they are supremely arrogant. I hear that pride goeth before the fall; what about you?

So what are your thoughts on this entire issue? Has Google really slipped, and is it trying to regain its footing before it finds itself fully on the slippery slope with only trouble ahead? Is this much ado about nothing and just a PR move by Google? In the end, is it really possible to tell the true difference between good content and bad if there are other mitigating factors, like links, that can offset those signals?

Give us your take on this because you are the experts.

  • http://www.marketingpilgrim.com Andy Beal

    To adapt a classic quote from the movie The Usual Suspects…

    “The greatest trick Google ever played was convincing the world that spam problems didn’t exist.”

    Really, if Google is so adept at detecting spam and content farms, why are there so many different ways to submit such to Google?

    If Google can figure out paid links, why did it invent the nofollow tag?

    These issues have faced Google for many years; it’s only now that the company has reached the size of Microsoft that the media and politicians are licking their chops and reaching for the rubber gloves! ;-)

    • http://www.stanleyoppenheimer.com searchengineman

      Unless Google is going to put less emphasis on links (not likely) or hire IBM’s Watson to curate the entire body of literature and compare the quality of the content, they have no choice but to hire humans.

      Searchengineman

      • http://www.ClickWebDesign.com.au Chris – ClickWebDesign

        The only humans that will be hired are engineers to tweak the algo :) Seriously, trying to manually index the web?

        The real question is: how has Google achieved this? 30-40% spun articles were the norm; will 50-60% be needed now? Are there improved spelling and grammar checks? Maybe checking the unique words on a page to indicate depth? (A toy sketch of that last idea follows below.)

        One thing’s for sure, the SEO guys that have fallen from the first page of Google will be trying to figure it out!
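        As a purely illustrative aside on that last guess, here is a toy sketch of a unique-word ratio used as a crude “depth” signal. The function, sample text, and interpretation are hypothetical assumptions for illustration; nothing here reflects a confirmed Google check.

        ```python
        # Hypothetical illustration only: the ratio of unique words to total
        # words as a crude proxy for how repetitive or "thin" a page's text is.
        import re

        def unique_word_ratio(text: str) -> float:
            words = re.findall(r"[a-z']+", text.lower())
            return len(set(words)) / len(words) if words else 0.0

        thin = "how to get pregnant fast how to get pregnant fast tips to get pregnant"
        deep = "conception timing depends on ovulation, cycle length, and overall health"

        print(round(unique_word_ratio(thin), 2))  # low ratio: repetitive, thin copy
        print(round(unique_word_ratio(deep), 2))  # higher ratio: more varied vocabulary
        ```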

  • http://kercommunications.com Nick from Ker Communications

    As an internet marketing guy (note the avoidance of “SEO”), I am all for an SEOLeaks type of site. There are just so many “ethical and organic” SEOs whose own sites rank solely from paid links and other schemes. The problem with a whistleblower site is that it would be full of false accusations, “framing” of innocents, and who knows what other kind of nonsense. Shady SEOs may be unethical, but they are sometimes resourceful.
    Cheers to Google for trying to continue to provide quality, relevant search results. They have to or else they will become irrelevant themselves.

  • http://www.johnakerson.com John Akerson

    You asked a great question “will Google ever consider more human fire power that is not pointed toward building the perfect technology but rather dealing with the real world results of that technology?”

    I don’t think that is the answer, because an algorithm is more scalable and also more defensible. Think about it this way – if there is a problem and J.C. Penney gets knocked down in Google because of human intervention… Google is to blame. If it is just a mathematical formula, there’s no real blame to assign… From another perspective, if Google hired real people to do the tinkering, those people would all have to be multi-millionaires, because if they weren’t there would be ENORMOUS potential for SEO corruption. Imagine a company offering to pay one of these people, say, $50k to lower a competitor’s rankings… I think there are a number of other reasons, but ultimately, an algorithm is the way to go. (Whether it is an arrogant algorithm or not is another question.) :-)

    • http://www.frankthinking.com Frank Reed

      I see your point John.

      Scale at the expense of quality, though, is a bad choice. I don’t think an algorithm can ever TRULY determine the quality of content because TRUE quality is very subjective. It can pick out something that is nonsensical, of course, but being a true judge of quality just isn’t realistic.

      As for the defense of the algorithm? That’s a cop out. Why should we let Google just say “The algorithm made me do it!”? That algorithm was programmed by people so ultimately this all comes down to flesh and blood.

      As for JC Penney and Overstock, both of those cases only existed because of human intervention. The algorithm caught nothing as far as I can tell (anyone at Google want to tell us differently? Please do!).

      It’s a philosophical and business argument that has no real winners, only losers, and in the end we’ll all just have to live with whatever Google gives us until something better comes along.

  • http://www.jaankanellis.com Jaan Kanellis

    Here is the problem… why did it take the NYT outing JC Penney for Google to find these links? They truly have a problem detecting spam algorithmically. Also, they have hundreds of reports of real spam within their own forums that go completely ignored. Why? Here is the free “manpower” they are looking for!

    • http://www.irishwonder.com IrishWonder

      Actually, as of late Google has been much more responsive toward spam reports.

  • http://www.seo-theory.com/ Michael Martinez

    “These ‘catch all’ content sites aren’t themed on a particular area, thus they have never fit the true Google site profile for quality. Despite that they get rankings, or at least they did.”

    WHOA! Whoa! Whoa there.

    At no time in the history of the Earth has Google ever declared that THEMED SITES are what they profile for “quality”. Good gravy, by that definition, Marketing Pilgrim is pure CRAP and CNN.COM can just kiss its sweet behind good-bye if Google cleans up the Web like this.

    Theming, siloing, focusing — whatever you want to call it — NO WEB SITE has to do this. Ever. That is NOT what separates “content farms” from the rest of the Web.

    • http://www.frankthinking.com Frank Reed

      @Michael – Touch a nerve, did I?

      My question to you then is why does Google refer to removing SITES from SERPs and not just particular pages from sites? All of this talk of content farms comes around to Google talking about how they rank sites when they are really just ranking pages within a site. Sites can have both good and bad content (we laugh at ourselves as being an example of both!). Most sites outside of news, etc., are themed, though, because they are about a business, and coloring outside those lines confuses the purpose of the site.

      So what is Google ranking? A page of good content or a site? How do they determine if a page is good? Can it still be good content if the rest of the site stinks?

      Just wondering. Unlike most people, I do not claim expertise; rather, I see myself as a lifelong learner in this space, so being called out as wrong is fine because, last time I checked, I am not perfect.

  • http://www.essentialservicesinc.com/ shp

    I like the way Google handles ranking; it can really provide quality sites to users. Nobody really wants to find junk sites every time they search for something.

  • Brian Rubin

    “Maybe the market is ready for an SEOleaks site where someone reveals the dirty little SEO secrets of sites that have garnered success in Google for all the wrong reasons?”

    OMG, I would totally love this and be 100% behind it. I also agree that Google should employ human SEO folks to weed out bad content farms and other problematic sites.

    • http://www.irishwonder.com IrishWonder

      Somebody has obviously thought of it already – SEOLeaks.com, .net and .org were all registered back in December.

  • http://www.seo-theory.com/ Michael Martinez

    @Frank: “My question to you then is why does Google refer to removing SITES from SERPs and not just particular pages from sites?”

    I’m sure you’ve had the opportunity over the past two days to see some of the reports I have seen, but let me point out that many of these sites are only losing a few positions on some groups of keywords. I’ve looked at some friends’ sites and they are showing me a shotgun effect. A common report is that sites have lost 30-40% of their Google referrals.

    So they are not exactly wiping ALL affected sites from the SERPs.

  • chris

    Hmm… how does Google decide what is low-quality content? You may think a site is low quality, but another person may find it very helpful. The article on getting pregnant fast may seem funny or uninformative to one person, but someone else may like it. An open internet is the way to go; when people start saying what can and can’t be seen is when we have problems. I think that most of us are smart enough to decide for ourselves what content we want to see or not see; you can always LEAVE the page! I don’t need my search engine telling me what content I can and can’t see.

    Do you guys even know who Google is playing for? Do some research; the name George Soros comes up. If you don’t know who that is, you must be living under a rock. Google is playing god and trying to decide what is good for you and me. Google makes money on its ads and advertising; you can bet that it ranks certain sites higher for its own profit and benefit.