Google's John Mueller has stated that Googlebot processes only the first 500KB of a site's robots.txt file.
John made a statement on his Google+ page that read, “#102 of the things to keep in mind when working on a big website: If you have a giant robots.txt file, remember that Googlebot will only read the first 500kb. If your robots.txt is longer, it can result in a line being truncated in an unwanted way. The simple solution is to limit your robots.txt files to a reasonable size.”
If your robots.txt file exceeds 500KB, Googlebot will ignore everything past that limit and truncate the file, potentially cutting a directive off mid-line, which may affect your website's health in Google search.
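Since truncation happens silently, it can be worth checking your file's size before deploying it. Below is a minimal sketch of such a check; the function name and the example contents are illustrative, and the limit constant simply reflects the 500KB figure quoted in John's post.

```python
# Sketch: check whether a robots.txt body fits within the 500KB
# processing limit quoted in the post, so nothing gets truncated.
GOOGLEBOT_LIMIT_BYTES = 500 * 1024  # 500KB, per the figure in the post

def robots_txt_within_limit(content: bytes) -> bool:
    """Return True if the robots.txt body is small enough to be read in full."""
    return len(content) <= GOOGLEBOT_LIMIT_BYTES

# A tiny robots.txt easily fits; a 600KB one would be truncated.
small = b"User-agent: *\nDisallow: /private/\n"
large = b"#" * (600 * 1024)
print(robots_txt_within_limit(small))  # True
print(robots_txt_within_limit(large))  # False
```

Running a check like this as part of a deployment step is a simple way to catch an oversized file before Googlebot ever sees it.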
John is currently answering queries about robots.txt handling on his Google+ page; see his post to learn more about robots.txt file controls.