"How do they tell if they have bad results?
A: They have a bunch of watchdog services that track uptime for various servers to make sure a bad one isn't causing problems. In addition, they have 10,000 human evaluators who are always manually checking the relevance of various results."
Over at Webmaster World, kamikaze Optimizer rationalizes:
"Yeah, I remember the past discussions and the leaked document, but 10,000 is a huge number.
Think of it this way: out of 10,000, 1% of them would be talking about it on blogs and the like… and since the leaked document (what was that, 2-3 years ago?) we have heard nothing, correct?
I have seen Google IPs hit my sites (Corp IPs, that is, not bots).
I blocked one once for using a bad user agent and the employee emailed me; it was all very funny.
The 10,000 number is odd and interesting, I think.
It is clearly a huge investment in quality control, if it is done correctly."
Yes, the number was given as 10,000, but that total alone does not tell us much about whether the human evaluators are full-time, part-time, or even outsourced.