Google Rules 500KB Crawl Limit For Robots.txt Files!

Jan 31, 2012 | 2,081 views | by Navneet Kaushal

Google's John Mueller has stated that Googlebot will only process the first 500KB of a site's robots.txt file.

John made a statement on his Google+ page that read, “#102 of the things to keep in mind when working on a big website: If you have a giant robots.txt file, remember that Googlebot will only read the first 500kb. If your robots.txt is longer, it can result in a line being truncated in an unwanted way. The simple solution is to limit your robots.txt files to a reasonable size :)”

If your robots.txt file is very large and goes beyond 500KB, Googlebot will truncate the file and ignore everything past that limit. A rule may be cut off mid-line, which can affect your website's health in Google search.
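As a quick sanity check, you can measure your robots.txt size against the limit before it becomes a problem. Below is a minimal sketch in Python; the function name, the example domain, and the interpretation of 500KB as 500 × 1024 bytes are assumptions for illustration, not anything specified by Google.

```python
import urllib.request

# Assumed interpretation of the limit: 500 KB = 500 * 1024 bytes.
ROBOTS_TXT_LIMIT = 500 * 1024

def check_robots_txt_size(domain: str) -> None:
    """Fetch a site's robots.txt and report whether it exceeds the limit."""
    url = f"https://{domain}/robots.txt"
    with urllib.request.urlopen(url) as response:
        body = response.read()
    size = len(body)
    if size > ROBOTS_TXT_LIMIT:
        print(f"{url} is {size} bytes; rules after the first "
              f"{ROBOTS_TXT_LIMIT} bytes may be truncated by Googlebot.")
    else:
        print(f"{url} is {size} bytes; within the 500KB limit.")

if __name__ == "__main__":
    check_robots_txt_size("example.com")  # hypothetical domain
```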

John is currently answering queries about robots.txt handling on his Google+ page, where you can find more details on robots.txt file controls.


Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.