As part of its drive to completely remove support for the noindex directive in robots.txt files, Google is notifying webmasters who have such directives. Since yesterday, many webmasters have been receiving notifications from Google Search Console with the subject line "Remove 'noindex' statements from the robots.txt of …". Here is a screenshot shared by Bill Hartzer on Twitter:

From September 1, 2019, the noindex directive in your robots.txt file will no longer work. Google already announced this change earlier in July and is now sending out notifications to spread the word.

What Is The Significance of This Change?

If you receive this notification, make sure that everything currently handled by the noindex directive is supported in another way, and stop using noindex in your robots.txt file. If you haven't already done so, make these changes before September 1. Also check whether you are using nofollow or crawl-delay commands in robots.txt; if so, switch to a supported method for those directives as well.
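As an illustration, a robots.txt file containing the soon-to-be-unsupported rules might look like the sketch below (the paths shown are hypothetical examples, not from the original article):

```
# These rules stop working in Googlebot on September 1, 2019 — remove them
# and replace each with a supported mechanism (see the alternatives below):
User-agent: *
Noindex: /private/       # use a robots meta tag or X-Robots-Tag header instead
Nofollow: /guestbook/    # use rel="nofollow" on links or a robots meta tag instead
Crawl-delay: 10          # set the crawl rate in Search Console instead

# Disallow remains fully supported:
Disallow: /admin/
```

Note that `Disallow` itself is unaffected; only the unofficial `Noindex`, `Nofollow`, and `Crawl-delay` lines lose support.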

What Are The Available Alternatives?

Google has listed several options, some of which you may already be using in one way or another:

  • Noindex in robots meta tags: Supported both in HTTP response headers and in HTML, the noindex directive is the most effective way to remove URLs from the index when crawling is permitted.
  • 404 and 410 HTTP status codes: Both status codes mean that the page doesn’t exist, which will drop these URLs from Google’s index after they are crawled and processed.
  • Password protection: Unless markup is utilized to indicate subscription or paywalled content, concealing a page behind a login will lead to its removal from Google’s index.
  • Disallow in robots.txt: Search engines can only index pages they know about, so blocking a page from being crawled usually means its content won't be indexed. Although a search engine may still index a URL based on links from other pages without seeing the content itself, Google aims to make such pages less visible in the future.
  • Search Console Remove URL tool: A quick and simple method for temporarily removing a URL from Google's search results.
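To illustrate the first alternative, a page can be kept out of the index either with a robots meta tag in its HTML or with the equivalent X-Robots-Tag HTTP response header (the header form also works for non-HTML files such as PDFs). A minimal sketch:

```html
<!-- In the page's <head>: asks crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The same directive as an HTTP response header (the Apache configuration shown is just one hypothetical way to send it):

```
# Apache example: send X-Robots-Tag for all PDF files
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
```

Unlike a robots.txt rule, both forms require the page to be crawlable: Googlebot must be able to fetch the URL to see the directive, so don't combine them with a Disallow rule for the same path.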

Ritu is a qualified Google Ads Professional and Content Head at PageTraffic. She has spearheaded many successful search marketing campaigns and currently oversees PageTraffic's content marketing operations in India.