When a site is hit by a Google algorithmic update, the affected webmaster tries to get answers from Google, but is often unable to get a detailed response, let alone on a public forum. Recently, however, Matt Cutts, the head of Google's webspam team, responded to one such complaint not once but several times.

The complaint in question was posted on Hacker News, where Matt responded to the queries several times. What the responses contain is not the main point here; rather, it is the fact that Matt Cutts came back to the discussion so many times that has generated interest.

We have listed these comments from Matt Cutts below as insight into dealing with similar problems that webmasters face.

Understanding The Penalization:

“I'm not saying that the issues aren't fixable. But when the site came up for a review, we saw things that violate our quality guidelines: autogenerated pages with hundreds of affiliate links that consist of lots of keywords, and the links/keywords are duplicate content. If he fixes the issues by e.g. blocking out the autogenerated pages, then I expect his homepage will be found in Google. The autogenerated pages are also less useful to someone landing on a page because the pages quickly get stale: prices change, things go in/out of stock, etc.”

“You have an autogenerated web site that consists of practically nothing other than affiliate links to Amazon. You can make an infinite number of autogenerated pages on your site, e.g. http://www.filleritem.com/index.html?q=hacker+news http://www.filleritem.com/index.html?q=31.69 http://www.filleritem.com/index.html?q=teen+sex and each autogenerated page consists of literally hundreds of affiliate links stuffed with keywords for unrelated products.”
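
Matt's suggested remedy of "blocking out the autogenerated pages" can be put into practice fairly simply. Since every autogenerated URL in his example hangs off the q= query parameter, one possible approach (our illustration, not something Matt spelled out) is a robots.txt rule using the * wildcard that Googlebot supports:

User-agent: *
# Block crawling of any URL containing the ?q= parameter,
# which is where the autogenerated affiliate pages live.
Disallow: /*?q=

An alternative, if you want the pages crawled but kept out of the index, is a noindex robots meta tag on the autogenerated templates themselves. Either way, the goal is the same: keep the thin, quickly stale pages out of Google's index so the homepage can be evaluated on its own merits.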

About the 1:1 Support Over Email:

“We've also been experimenting with 1:1 support over email, by way of a link in our webmaster console. The tension there is finding a solution that scales. We do try to keep an eye on tweets, blog posts, Google+, Hacker News, and similar places around the web, but that's also hard to scale.”

Manual vs Algorithmic Spamfighting:

“The site was flagged both algorithmically and also escalated to a member of the manual webspam team. The basic philosophy is to do as much as we can algorithmically, but there will always be a residual of hard cases that computers might not do as well at (e.g. spotting hacked sites and identifying the parts of a site that have been hacked). That's where the manual webspam team really adds a lot of value.

“In addition to things like removing sites, the data from the manual webspam team is also used to train the next generations of our algorithms. For example, the hacked site data that our manual team produced not only helped webmasters directly, we also used that data to produce an automatic hacked site detector.

“If you're interested, I made a video about the interaction between algorithmic and manual spamfighting here:”

For more information on what Matt Cutts said, follow the thread on Hacker News.

Author

Navneet Kaushal is the Editor-in-Chief of PageTraffic Buzz. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet is also the CEO of the SEO services company PageTraffic, one of the leading search marketing companies in Asia.