Yahoo! Slurp gets Banned by Webmasters! 'Pros & Cons' Debate Rages On!

Jul 1, 2008 | 3,149 views | by Navneet Kaushal

According to Webmaster World, numerous Webmasters, fed up with Yahoo!'s inconsistent crawl rates and odd site traffic, have finally decided to take the ultimate step and ban Yahoo! Slurp from crawling their websites altogether.

Many Webmasters believe that Yahoo! might take action against them by marking their websites as 'harmful' via Yahoo! SearchScan.

If you too are looking for ways to ban Yahoo! Slurp altogether, here is how you can achieve that:

http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-02.html
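If all you want is a blanket ban, the standard robots.txt approach (per Yahoo!'s documentation linked above) is to target the Slurp user-agent with a site-wide Disallow. A minimal sketch:

# Block Yahoo! Slurp from crawling any part of the site
User-agent: Slurp
Disallow: /

Note that this only stops crawling; URLs Yahoo! has already indexed may take a while to drop out of its results.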

Here are some interesting posts from the thread at Webmaster World:

“Hi,
My site gets new content once a month, but the Yahoo! bot, for no apparent reason, crawls my site every day and crawls most of my site.

How do I stop it and save valuable bandwidth?
If it is possible via robots.txt, what is the syntax?”

“Putting this in robots.txt should at least slow it down:

User-agent: Slurp
Crawl-delay: 240”
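For context, Slurp reads Crawl-delay as the number of seconds to wait between successive fetches, so 240 works out to roughly one request every four minutes. If throttling alone doesn't save enough bandwidth, a middle-ground robots.txt can also fence off the heaviest directories while leaving the rest of the site crawlable (a sketch; the /images/ path is purely an illustrative placeholder):

# Slow Slurp down and keep it out of bandwidth-heavy areas
User-agent: Slurp
Crawl-delay: 240
Disallow: /images/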

“Can't say how much I hate that bot; I keep meaning to block it, but always held off, thinking 'well, you never know.'

Just done a search of the log files for a small site.
Number of requests: 34,000+; number of visitors: 139.

I give in, Slurp, you're banned.”

“Unfortunately Yahoo are pretty useless (though not as useless as Microsoft).

Yahoo sends a lot of separate bots from different IPs, occasionally violates robots.txt instructions, and insists on requesting the index of any directory even when there are no links to it and the index option has been turned off on the server.

You might try increasing the number of seconds in the crawl delay to 2400 or whatever.

You can also ban Slurp China specifically if you don't cater to the Asian market.

I would discourage banning Yahoo altogether because Google needs competition, however inept, and because Yahoo will send at least some traffic – and all human visitors should be valued.

Webmasters have that choice, though.”
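On the Slurp China suggestion above: Yahoo!'s China crawler announced itself with its own user-agent, so it could be blocked on its own while the main Slurp bot carried on. A sketch, assuming the 'Yahoo! Slurp China' token that webmasters reported seeing in their logs at the time:

# Block only Yahoo!'s China crawler; the main Slurp bot is unaffected
# (verify the exact user-agent token against your own access logs)
User-agent: Yahoo! Slurp China
Disallow: /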

“Two more points I would add to this discussion:

Firstly, do not expect an instant response to any change in your robots.txt – the bots will be working from a cached version and may take a few days to update themselves.

My second point is theoretical and cannot be treated as proven.

Yahoo's SearchScan feature was recently introduced in partnership with anti-virus vendors McAfee. It rates "site safety" in a way that has some similarities with the notorious AVG LinkScanner.

SearchScan is related to McAfee SiteAdvisor, but your logs will never identify a hit from McAfee.

Shortly before Yahoo SearchScan was launched a new (actually revived) Slurp spider was also launched. There are so many Yahoo bots that conclusions are difficult, but McAfee has to get its data from somewhere, and that means fetching pages from your site.

Banning Slurp altogether may get your site flagged as "questionable" in the SERPs.”

“Fair warning.
I'm sticking with the decision: full ban, red card, sin bin for all Slurp.

The rest of the world can have the 139 users (30% looking for an image that has not been on the site for over three years).

If the site's not good enough for Yahoo!, then Yahoo! is not good enough for me (hugs Google).”


Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.