Google's crawler constantly scours the Internet for pages to index. This is why it's important to keep it away from the pages you don't want indexed. By using special code, you can disallow Googlebot from indexing pages that are under construction or otherwise incomplete.
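The usual form of that "special code" is a robots.txt file at the root of your domain. A minimal sketch, assuming (hypothetically) that the unfinished pages live under a /under-construction/ directory:

```
# robots.txt -- must sit at the root of the domain (example.com/robots.txt)

# Ask Googlebot not to crawl anything under /under-construction/
User-agent: Googlebot
Disallow: /under-construction/

# Or ask all well-behaved crawlers to stay out
User-agent: *
Disallow: /under-construction/
```

Keep in mind robots.txt is a request, not an access control: crawlers that honor it won't fetch the pages, but the files themselves remain publicly reachable by anyone who knows the URL.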
A thread at WebMasterWorld Forums shows another example of pages you probably don't want indexed: your FTP logs. The member asks, "So knocking out links and assuming the pages are protected, could it still be possible for the Googlebot to find the URL and 'accidentally' index it?"
The first response is fairly obvious, indicating that all FTP logs and other pages you do not want indexed should be password protected, making it impossible for Googlebot to crawl them.
Another member commented that FTP logs do get cached by Googlebot. So all FTP logs and other pages you don't want indexed really should be password protected to keep Googlebot from crawling them. Besides, when you enable the Google Toolbar with PageRank, it sends URL data to Google, so Google can find even unlinked pages.
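On Apache, for example, that password protection can be set up with HTTP Basic authentication. A minimal sketch, assuming (hypothetically) the logs sit in a /logs/ directory and a password file has already been created with the htpasswd tool:

```
# /logs/.htaccess -- require a login before serving anything in this directory
AuthType Basic
AuthName "Restricted area"
# Path to the password file created with: htpasswd -c /path/to/.htpasswd username
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Unlike robots.txt, this actually blocks access: Googlebot (and anyone without credentials) gets a 401 response instead of the page, so there is nothing to index even if the URL leaks.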