Jan 31, 2012 | by Navneet Kaushal

Google's John Mueller has stated that Googlebot only processes the first 500KB of a site's robots.txt file.

John made a statement on his Google+ page that read, “#102 of the things to keep in mind when working on a big website: If you have a giant robots.txt file, remember that Googlebot will only read the first 500kb. If your robots.txt is longer, it can result in a line being truncated in an unwanted way. The simple solution is to limit your robots.txt files to a reasonable size :-).”

If your robots.txt file exceeds 500KB, Googlebot will simply stop reading at that limit and ignore everything that follows. The file is effectively truncated, possibly cutting a directive off mid-line, which may affect how your website is crawled and ranked in Google search.
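If you want to confirm whether your own robots.txt file is anywhere near that limit, a quick check like the sketch below can help. This is a minimal example in Python, not an official tool; the example.com URL is a placeholder for your own domain.

import urllib.request

# 500KB limit that Googlebot reads, per John Mueller's post
ROBOTS_TXT_LIMIT = 500 * 1024

def check_robots_txt_size(url="https://www.example.com/robots.txt"):
    """Fetch a robots.txt file and warn if it exceeds Googlebot's 500KB limit."""
    with urllib.request.urlopen(url) as response:
        body = response.read()
    size = len(body)
    if size > ROBOTS_TXT_LIMIT:
        print(f"Warning: robots.txt is {size} bytes; Googlebot will only read "
              f"the first {ROBOTS_TXT_LIMIT} bytes and may truncate a rule.")
    else:
        print(f"OK: robots.txt is {size} bytes, under the {ROBOTS_TXT_LIMIT}-byte limit.")

if __name__ == "__main__":
    check_robots_txt_size()

Running the script against your site prints the file's size in bytes and flags it if it goes over the 500KB mark.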

John is currently answering queries about robots.txt handling on his Google+ page. Follow the link to learn more about robots.txt file controls.

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.