Posted June 11, 2007, 6:28 pm, with 9 comments


No, the headline isn't a joke. Google okayed hidden text, in certain contexts, of course.

Remember a couple weeks ago when Search Engine Land was outed for spamming? Several readers, in an effort to be helpful (I'm sure), pointed out that this particular trick was the Fahrner Image Replacement technique.

Despite what some commenters seemed to believe, most SEOs with CSS experience actually do know what the Fahrner Image Replacement technique is. I've used it before. However, by the strictest definition of "search engine spam," it would, unfortunately, be considered spam. Showing a search engine something different from what you show your users is, by definition, not an "approved" technique.
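For anyone who hasn't seen it, here's a minimal sketch of the kind of image replacement the article is talking about. (The original Fahrner technique hid the text in a nested span with `display: none`; the off-screen `text-indent` variant shown here became the more common version. The class name and image path are purely illustrative.)

```html
<!-- The heading keeps real text in the markup, which is what
     search engines and screen readers see. -->
<h1 class="site-logo">Search Engine Land</h1>

<style>
  .site-logo {
    /* Sized to match the replacement image */
    width: 300px;
    height: 75px;
    background: url(logo.png) no-repeat;
    /* Push the real text far off-screen so sighted
       users see only the image */
    text-indent: -9999px;
    overflow: hidden;
  }
</style>
```

The point of the whole debate: the rendered page shows an image while the crawler sees text, which is why, read strictly, the technique looks like hidden text.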

Danny Sullivan (you know, Editor-in-Chief of Search Engine Land) knows a thing or two about what search engines find acceptable; he's been pretty much the foremost authority on search for over a decade. In fact, what he said in response was, "We were totally hiding text and technically might be considered spamming the search engines." He vowed to correct it after things calmed down from SMX.

Good news, Danny. You don’t have to worry about it. Barry Schwartz, writing on SERoundtable, points out a Google Groups thread on this very issue. Susan Moskwa, Googler (okay, part of the Webmaster Central Google Groups support team), replies:

If your intent is purely to improve the visual user experience (e.g. by replacing some text with a fancier image of that same text), you don’t need to worry.

Barry also notes, however, that Matt Cutts says this does bring you closer to the "gray area." So take note: if you're using Fahrner and its ilk, other marginal activities on your site might make the Googlebot mad…

  • Isn’t it true that sites that are banned are reviewed manually first?

  • How about blogs that have dark text and background colors but a white "sheet" between them that takes a bit more time to load? (e.g. my blog) – is this viewed as hidden text by the bots? Thanks!

  • They also allow cloaking when using their Website Optimizer. There is a gray area in this as well, but as long as you are using such tactics properly and not maliciously, you should not be worried about being banned/penalized by Google.

  • Hmm .. this has always been a bit of a grey area. What about text that's hidden in one browser but shown in another? The whole idea of optimising your content for different channels implies that certain parts of the page may or may not be visible. This is perfectly legitimate and not for GoogleBot to decide.

  • Jordan McCollum

    GoogleBot and Google Inc. can certainly decide how and when they think they’re being abused. Google isn’t presuming to dictate ethics or morality; they have every right to dictate how they’ll conduct their own indexing and business.

  • I think the issue with hidden text comes down to intent. There are a lot of good uses for hidden text, like dynamic menus, that are clearly not implemented to deceive a search engine or manipulate rankings. And there are of course times when hidden text is used to manipulate rankings, like keyword stuffing with the text color matched to the background color.

    I think it’s good Google is letting it be known that it’s the intent that’s the issue and not simply the use of hidden text.


  • Tamus Royce

    I'm just switching search engines from Google to (founded by ex-Google employees).

    It is up to Google what to query within webpages. If posting the most ads and such gets Google money, I'm glad. I'll use Google's other things like free wireless or fiber optics (if it ever comes around here).

    But a good search engine should only query non-hidden things. HTML and XML both have techniques to tag certain search items within pages.

    It takes a little effort and a bit more coding on the web designer's part, but isn't that the point of preventing spam sites?