What Should Google’s Webspam Team Tackle in 2009?
Recently, Matt Cutts posted a question on his blog asking what readers would like to see Google’s webspam team tackle in 2009. Posted three days ago, it has already racked up a whopping 192 comments, and the count is probably still growing.
It is actually very interesting to read through the comments, as they give you an idea of what people are concerned about when it comes to fighting spam. As I said, there are too many to cover, so I highlight here the ones I liked the most.
- A better and faster way to report suspicious links. Someone suggested that Google develop a sort of “report suspect” toolbar: just hit the button and, wham, the site gets reported to Google. Hmmm, I'm not so sure about this one. Spammers would most likely abuse it for their own benefit, to eliminate the competition.
- Scraped content getting indexed on the scraper’s site before the author’s site.
- Scraped content appearing higher in the search listings than the original content.
- This one I like: have Webmaster Tools report issues found on our websites, such as whether we have been penalized or have potential red flags, along with suggestions on how to fix them.
- Refined results on the SERPs regardless of how old a website is. In some cases Google appears to favor older sites over newer ones. However, one commenter noted that not all new sites are potential spammers; some actually have more up-to-date information, e.g., review sites.
- More focus on duplicate content (not for penalties, but for better results), especially non-English content. Nothing more to say here.
- More room for quality advertisers. It is rumored that AdWords is also being used for spam purposes.
- Severe penalties for domains proven guilty of no-no’s. Instead of banning just one specific service, ban everything that originates from the same domain, e.g., Gmail, Webmaster Tools, AdSense, AdWords, the works… I have to admit this could be difficult, but the commenter did mention “innocent until proven guilty”.
- Search results pointing to pages that are themselves just more search results. Sort of a domino effect.
- Sites ranking highly on Google due to paid text links.
- After PageRank, why not SpamRank? 🙂
- Google should devalue all links coming from MySpace, Facebook, etc.
- Using human reviewers to evaluate reported suspect sites.
- A GPS meta tag to pinpoint a business or organization at a specific location. :)
- A short and straight-to-the-point comment: shut down all splogs on Blogger.
And lots more. Go visit Matt Cutts’s blog and see for yourself. Maybe you can offer a suggestion that isn’t there yet; I am sure Matt would love it.
Subscribe to my feed for updates: RSS Feed
Follow me on: Twitter