Jul 18, 2006, by Navneet Kaushal

Google's crawler constantly scours the Internet for pages to index. That's why it's important to keep pages you don't want indexed out of its reach. With a few lines of configuration, you can ask Googlebot not to index pages that are under construction or incomplete.
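For example, a simple robots.txt file at the site root can ask Googlebot to skip such pages (the paths below are just illustrations, not anything from the thread):

```
# robots.txt — must sit at the root of the site
# Politely asks Googlebot not to crawl the listed paths
User-agent: Googlebot
Disallow: /under-construction/
Disallow: /incomplete-page.html
```

Keep in mind that robots.txt only discourages crawling; it doesn't stop a URL that Google already knows about from showing up in results.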

A thread at WebMasterWorld Forums shows another example of pages you probably don't want indexed: your FTP logs. A member asked whether, even after knocking out links to such pages and assuming they are protected, Googlebot could still find the URL and "accidentally" index it. The first response was fairly obvious: all FTP logs and other pages you do not want indexed should be password protected, making it impossible for Googlebot to crawl them.

Another member commented that FTP logs do get cached by Googlebot. So it is better that all FTP logs and other pages you don't want indexed be password protected, which prevents Googlebot from crawling them at all. Besides, if you have the Google Toolbar enabled with PageRank display turned on, it sends URL data to Google, so Google can discover even unlinked pages.
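As a sketch of the password-protection advice, assuming an Apache server (the file paths here are hypothetical), an .htaccess file dropped into the logs directory could look like this:

```
# .htaccess in the directory holding the FTP logs (Apache basic auth)
AuthType Basic
AuthName "Restricted logs"
# Password file created beforehand with: htpasswd -c /home/user/.htpasswd admin
AuthUserFile /home/user/.htpasswd
Require valid-user
```

Unlike robots.txt, this returns a 401 to any unauthenticated client, Googlebot included, so the pages can't be crawled or cached even if their URLs leak.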

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.