Posted July 9, 2008 4:26 pm, with 27 comments


Today, Google Fellow Amit Singhal posts on the Official Google blog explaining Google’s ranking system. But before you all start salivating and clicking, note that this is a very “high level” overview—no secrets given away. Still, it’s a good reminder of the basics of Google’s ranking system.

Singhal writes that Google’s ranking system is based on three guiding principles:

  1. ~~A robot may not injure a human being or, through inaction, allow a human being to come to harm.~~
  1. Best locally relevant results served globally.
  2. Keep it simple. [Thanks for not calling us stupid.]
  3. No manual intervention.

Now for the breakdown: [musical break, awesome choreography]

Ahem. Now for the interpretation:

Best locally relevant results served globally.
Despite the fact that Singhal says this one is “obvious,” to me this is the most oblique of all of the statements here. But yes, apparently he does mean that the best “local” result should be the best overall result:

We often call this the “no query left behind” principle. Whenever we return less than ideal results for any query in any language in any country – and we do (search is by no means a solved problem) – we use that as an inspiration for future improvements.

‘Kay, maybe I’m dense, maybe I’m just too bounded by the denotations of the words that he’s using, but . . . what? What he’s saying here sounds like “When queries don’t return good results, we want to improve.” That’s awesome. But what does that have to do with “local” and global results? Or does he mean that the best “local” result (results served in other languages/countries, not what we would actually consider “local searches,” I guess) should be as good as the best overall result? That raises a slew of other questions.

Keep it simple.
The first principle “is obvious” and this one “seems obvious.” (I think I’m in for a headache, especially when he asks “Isn’t it the desire of all system architects to keep their systems simple?” Oh, if only.)

No, really, this is simple:

We work very hard to keep our system simple without compromising on the quality of results. . . . We make about ten ranking changes every week and simplicity is a big consideration in launching every change. Our engineers understand exactly why a page was ranked the way it was for a given query.

But no, they won’t tell you if you corner them at a search conference.

No manual intervention.
Singhal gives two reasons for this: first, that any one individual is too subjective to render good, objective results and second:

often a broken query is just a symptom of a potential improvement to be made to our ranking algorithm. Improving the underlying algorithm not only improves that one query, it improves an entire class of queries, and often for all languages.

This does come with a caveat:

I should add, however, that there are clear written policies for websites recommended by Google, and we do take action on sites that are in violation of our policies or for a small number of other reasons (e.g. legal requirements, child porn, viruses/malware, etc).

Singhal promises more fun in a future post, where he’ll “discuss in detail the technologies behind our ranking and show examples of several state-of-the-art ranking techniques in action.” Excellent.

  • Blogs with RSS get better rankings in Google anyway.

    Symbian’s last blog post..Nokia S60 Symbian Windows Live client available in Europe

  • Jordan – thanks for the interpretation–which Google always seems to need.

  • Jordan McCollum

    @Andy—What, didn’t you like the dancing?

  • The Google post bugged me for a few reasons. I really think they are pretending to be transparent when in fact they are very secretive about ranking even though that hurts a lot of mom and pop websites (as well as large ones).

    As anybody with even basic SEO knowledge knows, the key challenges to a site ranking properly have little to do with Amit’s three points. Fairly new websites that are fantastically informative generally rank very poorly, violating principles 1 and 2. The general reason is that Google is worried that spam sites – which tend to be newer – will flood the results if time and lack of incoming links are not taken into account.

    Google feels the lack of transparency and ranking quirkiness is needed to avoid online chaos. I don’t agree. If Google really wants to protect their virtual monopoly on search they would join Web 2.0 sensibilities and follow a more transparent path, clearly identifying with hundreds of examples what they think are best practices sites and deceptive practices. Most importantly they’d talk about the gray areas in acceptable linking practices – an item that frustrates the efforts of most legitimate webmasters even if they don’t know it.

    Joe Hunkins’s last blog post..Google Ranking Needs a Spanking

  • I agree with Joe, this just makes for more confusion. Although I do like the last statement you quote about coming out with details of the technology. I just don’t know if what Google describes as details is the same as what you or I would call details.

    Time will tell, but we’re never really gonna get anything substantial. Google can’t afford to let much out of the bag.

    Erik’s last blog post..Generating Content for Multiple Blogs

  • Google isn’t about to let us, shall we say ‘lesser humans,’ really know how they give rank. Personally I think there’s someone standing in front of a dart board throwing darts, and whatever they land on, that’s the rank they give. What do you think? ’Cause that’s just my opinion.

    KeeKee’s last blog post..Hope you have a blast!!!

  • I guess that saves some money on buying Google for Dummies doesn’t it?!

  • I agree with Joe as well. But ya know, in the end it’s in Google’s (and our own) best interest to keep their system under wraps. Not only to keep SEOs from gaming it, but also to keep businesses competitive. If we all knew exactly how to game Google, then SEM would be a piece of cake and there would be very little competition between competing businesses – we would all do the same thing. Innovation would go out the window and the internet would be a tiny bit less interesting….Although we would still have LOLCATS!

    Joe Hall’s last blog post..Top 5 National Real Estate Franchise Web Sites

  • “We want to serve relevant results.” “Create good content.” If I had a nickel for every time I hear that stuff.

    Utah SEO Pro’s last blog post..Interview with SEO: Brian Carter and Search Engine Journal Post

  • Talk about writing your own ticket. If someone could figure out how the algorithm worked, they’d be set for life. I’m sure Google makes sure its employees stay employees for a very long time.

    Until then we’ll all just have to keep guessing.

  • They definitely modify the search results manually if they find someone doing aggressive SEO . . . don’t they?

    Cars blog’s last blog post..Sony MEXR1 Car DVD Player Review

  • I don’t believe that they rank based on that guidance. Ranking changes are often unpredictable.

  • How very enlightening. Was this needed?

    Nicole Price’s last blog post..Hair Care Products: Discounts and Deals

  • Interesting interpretation . . . now to test it on websites . . .

  • Well, as usual, Google is very vague. The title of the article got me awfully excited, but I’m afraid Singhal’s revelations let me down a bit.

    Lorien’s last blog post..I Miss You So Much Graphic

  • I think total transparency would hurt more than it helps. The small Mom & Pops wouldn’t have a chance against bigger shops that would just have so many more resources to dominate. At least now Moms and Pops can go after longer-tail phrases / low-hanging fruit while the bigger shops duke it out for the big prize on more generalized terms.

    I honestly think some sort of rotating ranking system would be more accurate. Within our niche, there are a good 30 – 50 sites that have good info. But only 10 really get featured because they get the first-page rankings. I know many will think that reeks of being unfair, but if you really want the Moms and Pops to be on a level playing field, I think that would do it.

  • As cryptic as ever, but who can blame them . . . they do hold the secret to life the universe and everything. Seriously though, I have seen the locally adjusted SERP results in action. Being that I’m in Canada, I often see Canadian based websites show up near the top, that wouldn’t even make the first page when queried from the US.

    Top Rated Digital’s last blog post..The Top 10 Digital Cameras Ripped Apart

  • Jordan McCollum

    @TRD—Isn’t that the exact opposite of what Singhal said here? He said the best local result should be shown in all results (“Best locally relevant results served globally.”), and you’re seeing poor local results beating good global results.

    I don’t doubt that you’re right, but I don’t think that’s an example of the first principle actually “working.”

  • Google is just playing with us. There is so much inconsistency with them that it is super frustrating at times when one has to figure out what exactly is what.

    As far as the local search aspect is concerned, one can stipulate pages from your local zone, which returns only locally relevant results, or pages from the web, which will add the best international sites along with the top locally ranked ones.

    I’m based in South Africa, and it works pretty well for us, giving a spread of local results combined with the top international results to compare against. The challenge is to get local stuff up to international spec, but not too much dreck comes through.

  • I think you might be better off reading their patent information for a more accurate look at SEO and their algo.

    Mr Marketing’s last blog post..Negative PR is still PR.

  • The truth of the matter is that Google is not going to let us know their secrets, and they will always ensure they are one step ahead of the SEOs. That is how they stay on top in search. Although I would love to work out their algos.

  • Well said Search Engine Optimisation, how it works will always be a mystery and how they say what they say will always be coded with a certain amount of truth.

  • PS3

    Yup, just like the recipe for KFC!

    The mystery will keep so-called SEO “gurus” in jobs for a good while. From my layman’s view, it is a lot of trial and error!

  • Sites definitely rank in different spots between Yahoo and Google. Does anyone have a good explanation as to why, or how to control it?

  • Singhal says “no manual intervention,” but then what about the case of Wikipedia? Is it Google’s only son?

  • They will never spill the beans that easily.

    Travel Point’s last blog post..The Samba of Brazil

  • Good article. I do feel that we will never really find out, and we’ll just keep on link building, writing good content, adding meta data, keeping things fresh, designing to good standards, and targeting our respective audiences.