Our goal is simple: to give people the most relevant answers to their queries as quickly as possible. This requires constant tuning of our algorithms, as new content—both good and bad—comes online all the time.
Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
The “Personal Blocklist for Chrome” extension has played no part in this, according to Google, and the impact will only be felt in the US for now (which is probably where most of the complaining was coming from anyway, because that’s what we do, like it or not).
Over at Search Engine Land, Sullivan points out how Richard Rosenblatt, CEO of prime content farm target number one Demand Media, got very defensive about Demand getting the ‘stupid content farm’ label. If Mr. Rosenblatt is so upset by this implication, then my suggestion is to simply check the quality of your content before you say any more. The example given at SEL, a ranked ‘article’ from Demand’s eHow site about how to get pregnant fast, is both pathetic and comical. Check it out, you’ll chuckle.
The SEL post goes on to say:
Rosenblatt is right that Demand Media properties like eHow are not necessarily content farms, because they do have some deep and high quality content. However, they clearly also have some shallow and low quality content.
That content is what the algorithm change is going after. Google wouldn’t confirm it was targeting content farms, but (Matt) Cutts did say again it was going after shallow and low quality content. And since content farms do produce plenty of that — along with good quality content — they’re being targeted here. If they have lots of good content, and that good content is responsible for the majority of their traffic and revenues, they’ll be fine. If not, they should be worried.
This is why I think this update is really trying to do what Google has claimed it does all along but has yet to deliver on: simply weeding out the bad content from the good. Google likes to see themed sites and thus supposedly rewards those sites overall for a consistent message within a certain silo of information. What they have done, though, is take eHow articles and other content of questionable quality and rewarded it, even though the only theme in sites like that is silos of information occurring in one spot of the overall site (like About.com). These ‘catch-all’ content sites aren’t themed on a particular area and thus have never fit the true Google site profile for quality. Despite that, they get rankings, or at least they did.
So will Google now be able to go into a site and find only the good quality content while ignoring the bad? What will be the percentage balance that tips the scales for Google to deem a site as junk despite a mix of good and bad content? What will the collateral damage be if they have chosen specific targets but will just let the algorithm make the decision? That sounds a bit against their usual way of doing things and, while I couldn’t tweak an algorithm if my life depended on it, I would have to think it would get pretty difficult to adjust full-site ranking signals to identify individual pieces of quality content within a site and then reward just those quality entries. Will it be a crop that gets targeted or will entire farms get shut out? With just about 12% of results being affected, that still seems pretty low considering how awful some SERPs can be when it comes to quality content returns.
In the end, though, maybe this is a moot point anyway. Any SEO worth their salt will tell you that while great content is necessary, it is mostly so for the reader. SEOs know that given two sites, one with great content but only the ability to attract links organically, and the other with content that barely meets English language quality standards but a boatload of links pointing to it from an SEO’s efforts, the link-heavy craptent will win almost every time.
This might be the real problem that Google has. Maybe their heavy dependence on links as a signal of quality is simply being gamed. SEOs have found what works to rank a site regardless of the quality of its content, and it will be interesting to see if this update truly addresses that on a “Google-wide” basis. Sure, the engine has come down on JC Penney and Overstock, but it took outside recognition of the problems and then some pressure before anything changed. The algorithm didn’t identify the Overstock overindulgence in .edu links, an Overstock competitor did. JC Penney didn’t get whacked until the New York Times wrote about it.
Maybe the market is ready for an SEOleaks site where someone reveals the dirty little SEO secrets of sites that have garnered success in Google for all the wrong reasons? Maybe Google needs to hire a spy network of SEOs to do their bidding and get to the real root of the problem rather than depending on Uncle Algo to do it all? Can you imagine two SEO sleuths ‘interviewing’ a site for information and playing “Good Crop, Bad Crop”? Now that would be interesting.
I have said this on numerous occasions (and I realize I am not unique in it), but I think Google’s biggest threat to continued success is their dependence on technology to do everything and their minimizing of the human interaction and human work needed to tidy up the edges. As with any piece of fabric, if the edges are frayed, eventually the whole fabric will fall apart. Could that be Google’s fate? Is this whole algorithm mystique more show than go? Look at business history. Every deal (business) on the planet is 90 days away from extinction, I have been told. Why should Google be any different?
Since hiring bodies would plow into profits, will Google ever consider more human firepower, pointed not toward building the perfect technology but toward dealing with the real-world results of that technology? They should, in my opinion, but I suspect they won’t because, despite all of the good things Google does, they are supremely arrogant. I hear that pride goeth before the fall; what about you?
So what are your thoughts on this entire issue? Has Google really slipped and is trying to regain its footing before it finds itself fully on the slippery slope with only trouble ahead? Is this much ado about nothing and just a PR move by Google? In the end, is it really possible to tell the true difference between good content and bad if there are other mitigating factors like links that can offset those signals?
Give us your take on this because you are the experts.