Posted June 5, 2007 2:04 pm with 1 comment


Here are some raw notes from today’s session on spam.

Tim Mayer, VP of Product Management, Yahoo! Search

Starts off by reconfirming Yahoo's commitment to search and personalization; he wouldn't be there talking if they were not.

Spam is really about the intent behind specific techniques, not about which techniques you are using. If I use five keyword tags, is that spamming? The real question is whether you are helping the user. You can do a lot of different things, and there are legitimate uses for every technique. Example: IP cloaking helps with serving geographically appropriate content.
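The legitimate geo-targeting case mentioned above can be sketched as follows. This is a minimal illustration, not anything the panel described; the IP-prefix table and page paths are hypothetical stand-ins for a real geolocation database.

```python
# Hypothetical IP-prefix-to-country table standing in for a real GeoIP database.
# The prefixes below are from documentation-only (TEST-NET) ranges.
HYPOTHETICAL_GEO_DB = {
    "203.0.113.": "AU",
    "198.51.100.": "US",
}

# Localized variants of the same content, not different content per visitor.
LOCALIZED_PAGES = {
    "AU": "/au/index.html",
    "US": "/us/index.html",
}

def page_for_visitor(ip: str, default: str = "/index.html") -> str:
    """Pick a localized page by IP prefix; unknown visitors get the default."""
    for prefix, country in HYPOTHETICAL_GEO_DB.items():
        if ip.startswith(prefix):
            return LOCALIZED_PAGES.get(country, default)
    return default

print(page_for_visitor("203.0.113.7"))  # /au/index.html
print(page_for_visitor("192.0.2.1"))    # /index.html (no match)
```

The point of the example is the one the panel made: the same mechanism is spam only if it is used to show searchers something different from what users see.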

Where is the fine line? If you're doing optimization, it should be appropriate to your industry; your site shouldn't stick out like a sore thumb. Yahoo has an internal tool to flag any result that sticks out, and they also introduced spam reporting via Site Explorer. The majority of these reports are actually spam: about 70% are legitimate spam reports, and the remainder are "noisy."
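Yahoo's internal flagging tool isn't public, but the "sticks out like a sore thumb" idea can be illustrated with a toy statistical outlier check. This is a hedged sketch: the keyword densities and the z-score threshold are invented purely for illustration.

```python
from statistics import mean, stdev

def outliers(densities: dict[str, float], z_threshold: float = 1.5) -> list[str]:
    """Flag pages whose keyword density sits far from the mean of their peers."""
    values = list(densities.values())
    mu, sigma = mean(values), stdev(values)
    return [page for page, d in densities.items()
            if sigma > 0 and abs(d - mu) / sigma > z_threshold]

# Four pages with industry-typical densities and one stuffed page.
sample = {"a": 0.02, "b": 0.03, "c": 0.025, "d": 0.028, "e": 0.31}
print(outliers(sample))  # ['e'] — far denser than its peers
```

A real system would look at many signals, but the principle matches the advice given: stay within the norms of your industry.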

Suggests asking your SEO friends to review any site that is penalized by Yahoo. Get their input, clean up the site, and then submit a re-inclusion request.

Peter Linsley, Senior Product Manager for Search

Defines the penalty box: they don't want users to query "cuddly kittens" and get Viagra. They look for things that hurt the user experience, not just spam, such as pages with little content. Steps taken include lowered rankings or complete removal from the index.

Cloaking with malicious intent, hidden text, link farms, keyword stuffing, poor user experience, and empty pages that return a 200 instead of a 404 can all land your site in the penalty box.
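The "empty pages (no 404)" point comes down to status codes: missing URLs should return a real HTTP 404 rather than an empty page served with a 200. A minimal sketch, with a hypothetical in-memory page store standing in for a real site:

```python
# Hypothetical page store; a real site would consult its CMS or filesystem.
PAGES = {"/": "<h1>Home</h1>", "/about": "<h1>About</h1>"}

def respond(path: str) -> tuple[int, str]:
    """Return (status, body). Missing paths get a real 404, not a 'soft 404'."""
    if path in PAGES:
        return 200, PAGES[path]
    return 404, "<h1>404 Not Found</h1>"

print(respond("/about"))         # (200, '<h1>About</h1>')
print(respond("/no-such-page"))  # (404, '<h1>404 Not Found</h1>')
```

Serving the correct status lets crawlers drop dead URLs instead of indexing thousands of near-empty pages.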

Don't let spammers leverage your site: moderate the comments on your blog, for example. Suggests adding a captcha to your blog comment form.
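The comment-moderation advice above can be sketched like this. A real blog would use a proper CAPTCHA service; the arithmetic challenge and the link-count threshold below are hypothetical stand-ins chosen for illustration.

```python
import re

MAX_LINKS = 2  # assumption: more than 2 links in one comment smells like spam

def passes_challenge(question: str, answer: str) -> bool:
    """Check a trivial 'what is a + b?' challenge before accepting a comment."""
    a, b = (int(n) for n in re.findall(r"\d+", question))
    return answer.strip() == str(a + b)

def should_hold_for_moderation(comment: str) -> bool:
    """Hold link-heavy comments for manual review instead of auto-publishing."""
    return comment.lower().count("http") > MAX_LINKS

print(passes_challenge("What is 3 + 4?", "7"))                        # True
print(should_hold_for_moderation("buy http://a http://b http://c"))   # True
```

Either check alone raises the cost of drive-by comment spam, which is exactly the "don't let spammers leverage your site" point.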

Aaswath Raman, Program Manager, Live Search, Microsoft

Gives a quick overview of their guidelines, which are similar to the other search engines'. They look for pages built for search engines rather than for users. Gives examples of things that can trigger an investigation: suspicious links, affiliate sites without original content.

They look at things at the page level, at your inbound and outbound links, and at deceptive types of cloaking and redirects that lead to pages with different content.

Any blatant attempt to game their rankings, such as sites with no content or no value, can get the site delisted or blacklisted.

Matt Cutts, Software Engineer, Google, Inc.

Jokes, “if you’ve left 10,000 comments in a few minutes, you might be a spammer.”

Explains general spam policies, filtering, and manual removal; mostly a review of the basics. Reconfirms their attempts to contact webmasters when there is an issue. They'll even send out emails in as many as 10 different languages, explaining what was done wrong (hidden text, etc.). If it's obvious that you're a professional spammer, they'll likely not take the time to contact you. If you're a mom-and-pop site that "backed into" spam, you're more likely to hear from Google.

Announces they've beefed up the webmaster guidelines (see for yourself here) to provide concrete examples of things that are out of bounds. These examples don't automatically assume you're a bad guy; they accept that some webmasters just make simple mistakes. New guidelines include:


What percentage of time are you manually penalizing sites?

Cutts – predominantly uses algorithms, but there are very scalable ways to use people to monitor spam, and they've been looking at that more recently.
Mayer – uses both algorithmic and human approaches to monitor spam.
