There seems to be a significant problem with Yahoo!’s crawler. Reports from the Search Engine Watch and WebmasterWorld forums indicate that Yahoo! Slurp is indexing pages that should not be indexed; it appears that some of its bots are not honoring the robots.txt file. The pages are being crawled at rates that can be very harmful. A member of the Search Engine Watch Forums has posted an example from his logs:

“Host: 72.30.216.22
/suspended.page/
Http Code: 404 Date: Jan 17 01:12:31 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.67.78
/suspended.page/
Http Code: 404 Date: Jan 17 01:11:24 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.74.155
/suspended.page/
Http Code: 404 Date: Jan 17 01:08:45 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.71.43
/suspended.page/”
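If Slurp were honoring robots.txt, a simple Disallow rule would be enough to keep pages like the one above out of its crawl. As a quick sanity check, site owners can verify what their robots.txt rules actually permit using Python's standard-library robots.txt parser. This is a minimal sketch with a hypothetical rule set and example URLs, not the forum poster's actual configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules a site owner might use to keep
# suspended pages away from Yahoo!'s crawler (user-agent "Slurp").
rules = """
User-agent: Slurp
Disallow: /suspended.page/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Slurp should be refused access to the suspended page...
print(parser.can_fetch("Slurp", "http://example.com/suspended.page/"))
# ...but remain free to crawl everything else.
print(parser.can_fetch("Slurp", "http://example.com/index.html"))
```

If the parser says the URL is disallowed and the crawler still requests it, the problem is on the crawler's side, which is exactly what the forum reports suggest.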

Author

Navneet Kaushal is the Editor-in-Chief of PageTraffic Buzz. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet is also the CEO of the SEO services company PageTraffic, one of the leading search marketing companies in Asia.