Jan 18, 2007 | by Navneet Kaushal

There seems to be a big problem with Yahoo!'s crawler. Reports from the Search Engine Watch and WebmasterWorld forums indicate that Yahoo! Slurp is crawling and indexing pages that should not be indexed. It appears that some of Yahoo!'s bots are not honoring the robots.txt file, and they are requesting pages at rates that can be very harmful to a site. A member of the Search Engine Watch forums posted an example from a server log:

“Host: 72.30.216.22
/suspended.page/
Http Code: 404 Date: Jan 17 01:12:31 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.67.78
/suspended.page/
Http Code: 404 Date: Jan 17 01:11:24 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.74.155
/suspended.page/
Http Code: 404 Date: Jan 17 01:08:45 Http Version: HTTP/1.0 Size in Bytes: –
Referer: –
Agent: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Host: 74.6.71.43
/suspended.page/”
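For context, a compliant crawler is expected to fetch a site's robots.txt and skip any path the file disallows before requesting it. The sketch below, using Python's standard `urllib.robotparser` module, shows the check a well-behaved bot would perform against a hypothetical robots.txt rule blocking the `/suspended.page/` path seen in the logs above (the domain and rule are illustrative, not taken from the affected sites):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt meant to keep Yahoo! Slurp away from /suspended.page/
robots_txt = """\
User-agent: Slurp
Disallow: /suspended.page/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler runs this check before every fetch; the repeated
# requests in the logs above suggest this check was being skipped.
allowed = parser.can_fetch("Slurp", "http://example.com/suspended.page/")
print(allowed)  # False: Slurp is disallowed from this path
```

If the crawler respected this rule, the `/suspended.page/` requests in the log excerpt should never have appeared at all, which is why webmasters suspected the bots were ignoring robots.txt rather than merely misindexing.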

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.