Here are some raw notes from today's session on spam.
Starts off reconfirming Yahoo's commitment to search and personalization. He wouldn't be there talking if they weren't.
Spam is really about the intent behind using specific techniques; it's not about which techniques you use. If I use five keyword tags, is that spamming? It really comes down to whether you are helping the user. You can do a lot of different things, as long as you are helping the user. There are legitimate uses for every technique. Example – IP cloaking helps with geographic targeting.
Where is the fine line? If you're doing optimization, you should be appropriate to your industry; you shouldn't stick out like a sore thumb. They have an internal tool to flag any result that sticks out. They also introduced spam reporting via Site Explorer. The majority of these spam reports are actually spam – about 70% of reports are legitimate spam reports; the remainder are "noisy."
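Yahoo's internal flagging tool is not public, so purely as an assumed illustration of "sticking out like a sore thumb," here is a minimal z-score sketch that flags pages whose keyword density is far from the norm for their peer group. The function name and threshold are hypothetical.

```python
# Hypothetical sketch: flag results that "stick out" from their peers.
# This z-score heuristic is an assumption for illustration only; it is
# not Yahoo's actual tool.

def flag_outliers(densities, threshold=3.0):
    """Return indices of pages whose keyword density is more than
    `threshold` standard deviations from the peer-group mean."""
    n = len(densities)
    mean = sum(densities) / n
    variance = sum((d - mean) ** 2 for d in densities) / n
    std = variance ** 0.5
    if std == 0:
        return []  # every page looks alike; nothing sticks out
    return [i for i, d in enumerate(densities)
            if abs(d - mean) / std > threshold]
```

A page at 50% keyword density among peers at 2% would be the only one flagged; sites in line with their industry pass untouched.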
Suggests asking your SEO friends to review any site that is penalized in Yahoo. Get their input, clean up the site, and then submit the re-inclusion request.
Peter Linsley, Senior Product Manager for Search, Ask.com
The definition of the penalty box: doesn't want a user to query "cuddly kittens" and get Viagra. Looking for things that hurt the user experience, not just spam – such as pages with little content. Steps taken include lowered rankings or complete removal from the index.
Cloaking with malicious intent, hidden text, link farms, keyword stuffing, poor user experience, and empty pages (that don't return a 404) can all land your site in the penalty box.
Don't let spammers leverage your site – e.g., moderate the comments on your blog. Suggests adding a captcha to your blog comments.
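The moderation advice above can be sketched with a couple of common heuristics. This is a toy illustration, not any blog platform's API; the honeypot field and the link threshold are assumptions.

```python
# Minimal sketch of keeping spammers from leveraging blog comments.
# Heuristics and field names are assumed for illustration.

def looks_like_comment_spam(comment, honeypot_value=""):
    """Hold a comment for moderation if it trips simple heuristics:
    a filled honeypot field (bots fill every input) or too many links."""
    if honeypot_value:  # hidden form field a human would leave blank
        return True
    text = comment.lower()
    link_count = text.count("http://") + text.count("https://")
    return link_count > 2  # link-stuffed comments get held for review
```

Heuristics like these complement a captcha: the captcha stops most automated submissions, and the checks catch whatever slips through for human review.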
Aaswath Raman, Program Manager, Live Search, Microsoft
Quick overview of guidelines, similar to the other search engines. Looking for pages built for search engines rather than for users. Gives examples of things that can trigger an investigation: suspicious links, affiliate sites without original content.
They look at things at the page level, your inbound links, your outbound links, and deceptive types of cloaking and redirects – ones leading to pages with different content.
Any blatant attempt to game their rankings – sites with no content or no value – may get the site delisted/blacklisted.
Jokes, "if you've left 10,000 comments in a few minutes, you might be a spammer."
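Playing on that joke, a burst-rate check like the one below treats a flood of comments from one source as a spam signal. The window and threshold are assumed values for illustration, not any search engine's actual policy.

```python
# Sketch of a sliding-window rate check for comment bursts.
# max_comments and window_seconds are assumed illustrative values.
from collections import deque

class CommentRateMonitor:
    def __init__(self, max_comments=10, window_seconds=60):
        self.max_comments = max_comments
        self.window = window_seconds
        self.timestamps = deque()

    def record(self, timestamp):
        """Record a comment; return True once the poster exceeds
        max_comments within the trailing window_seconds."""
        self.timestamps.append(timestamp)
        # Drop timestamps that have aged out of the window.
        while self.timestamps and timestamp - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        return len(self.timestamps) > self.max_comments
```

Ten thousand comments in a few minutes trips a check like this immediately; a human commenter never comes close.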
Explains general spam policies, filtering, and manual removal – more of a review of the basics. Re-confirms their attempts to contact webmasters when there is an issue. They'll even send out emails in as many as 10 different languages, explaining what you did wrong (hidden text, etc.). If it's obvious that you're a professional spammer, they likely won't take the time to contact you. If you're a mom-and-pop that "backed into" spam, you're more likely to hear from Google.
Announces they've beefed up the webmaster guidelines – see for yourself here – to provide concrete examples of things that are out of bounds. These examples don't automatically assume you're a bad guy; they accept that some webmasters just make simple mistakes. New guidelines include:
- Avoid hidden text or hidden links.
- Don’t use cloaking or sneaky redirects.
- Don’t send automated queries to Google.
- Don’t load pages with irrelevant keywords.
- Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
- Don’t create pages that install viruses, trojans, or other badware.
- Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
- If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.
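As a toy illustration of the "hidden text" guideline in the list above: a scan of inline styles for text deliberately made invisible. Real detection is far more involved (computed styles, off-screen positioning, tiny fonts); this sketch only covers the most blatant inline cases, and the patterns chosen are assumptions.

```python
# Toy illustration: find inline styles that hide an element's text.
# Covers only blatant cases; not how any search engine actually works.
import re

HIDDEN_STYLE = re.compile(
    r"display\s*:\s*none"
    r"|visibility\s*:\s*hidden"
    r"|font-size\s*:\s*0",
    re.IGNORECASE,
)

def find_hidden_text_styles(html):
    """Return the inline style attributes that hide their element."""
    styles = re.findall(r'style\s*=\s*"([^"]*)"', html, re.IGNORECASE)
    return [s for s in styles if HIDDEN_STYLE.search(s)]
```

The point of the guideline stands either way: if text is styled so users can't see it but crawlers can, intent is easy to infer.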
What percentage of the time are you manually penalizing sites?
Cutts – predominantly uses algorithms, but there are very scalable ways to use people to monitor spam, and they've been looking at that more recently.
Mayer – uses both algorithmic and human approaches to monitor spam.