Some prominent changes are coming as a result of Google’s decision to stop supporting the noindex directive in robots.txt.

September 1 is just around the corner, and Google is reminding everyone that sites still using the noindex directive in a robots.txt file will run into trouble. On that date, Google will stop supporting unsupported and unpublished rules in the Robots Exclusion Protocol. If you currently rely on noindex in robots.txt, you will have to switch to one of the alternatives.

“In the interest of maintaining a healthy ecosystem and preparing for potential future open source releases, we’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019. For those of you who relied on the noindex indexing directive in the robots.txt file, which controls crawling, there are a number of alternative options,” the company said.

What Are The Available Alternatives?

Google has shortlisted some viable options, ones you should have been using anyway:

  1. Noindex in robots meta tags: Supported both in HTTP response headers and in HTML, the noindex directive removes URLs from the index as long as crawling is allowed (see the example after this list).
  2. 404 and 410 HTTP status codes: Both status codes tell Google the page does not exist, and such URLs are dropped from Google’s index once they are crawled and processed.
  3. Password protection: Unless markup is used to indicate subscription or paywalled content, hiding a page behind a login will generally remove it from Google’s index.
  4. Disallow in robots.txt: Search engines can only index pages they know about, so blocking a page from being crawled usually means its content won’t be indexed.
  5. Search Console Remove URL tool: This tool is a quick and easy way to remove a URL temporarily from Google’s search results.
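
For reference, here is a minimal sketch of what the first and fourth options look like in practice. The path /private-page/ is only a placeholder; substitute the URLs you actually want kept out of the index.

    <!-- Option 1: noindex robots meta tag, placed in the HTML <head> of the page -->
    <meta name="robots" content="noindex">

    # Option 1 (header form): equivalent X-Robots-Tag HTTP response header,
    # useful for non-HTML resources such as PDFs
    X-Robots-Tag: noindex

    # Option 4: robots.txt rule that blocks crawling of the page instead
    # (/private-page/ is a hypothetical path used here for illustration)
    User-agent: *
    Disallow: /private-page/

Note that the two approaches differ: the meta tag and header allow crawling but block indexing, while the robots.txt Disallow rule blocks crawling, which usually keeps the content out of the index.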

Setting High Standards

Google also announced that it is working to make the Robots Exclusion Protocol an official standard. Along with this announcement, Google released its robots.txt parser as an open-source project.

Change For The Better

The most important thing is to make sure you stop using the noindex directive in the robots.txt file. If you still rely on it, switch to one of the supported alternatives listed above before September 1.

Author

Ritu from PageTraffic is a qualified Google Ads Professional and Content Head at PageTraffic. She has spearheaded many successful search marketing campaigns and currently oversees Content Marketing operations of PageTraffic in India.