Google has updated its robots.txt testing tool in Webmaster Tools to make it easier to create and maintain robots.txt files.
Webmasters can now view their robots.txt file and test new URLs to check whether they are disallowed for crawling. To guide you through complicated directives, the tool also highlights the specific directive that led to the final allow-or-block decision. Webmasters can test changes in the tool first, and then upload the new version of the file to their server to make the changes take effect live.
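The allowed/disallowed check the tool performs can be sketched locally with Python's standard-library `urllib.robotparser`; the rules and `example.com` URLs below are hypothetical, and Google's own matching logic may differ in edge cases:

```python
import urllib.robotparser

# Hypothetical robots.txt rules for illustration
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A URL under /private/ is blocked by the Disallow directive;
# URLs that match no directive are allowed by default.
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/about.html"))           # True
```

Unlike this simple sketch, the Webmaster Tools version also tells you *which* directive produced the decision, which is the main convenience for files with many overlapping rules.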
In addition, Webmasters will be able to review older versions of their robots.txt file and check past issues.
Overall, the updated tool will make it easier for Webmasters to maintain and test their robots.txt files.