Google has officially announced that, starting in Q2 2018, its search crawlers will render JavaScript-based web pages on their own, without any support from the AJAX crawling scheme.

This means less work for site owners, who will no longer be required to provide pre-rendered versions of these pages to Googlebot.

Currently, Googlebot depends on the AJAX crawling scheme, under which site owners serve pre-rendered versions of their JavaScript-based web pages.

Now, thanks to improvements by Google's engineers, Googlebot will be able to render these pages itself.

From the second quarter of 2018 onwards, Google will rely entirely on Googlebot's own rendering instead of the AJAX crawling scheme.

Exactly What Is Going To Change

The AJAX crawling scheme accepts pages with either a “#!” in the URL or a “fragment meta tag”, and then crawls them using a “?_escaped_fragment_=” parameter in the URL.

At present, for Googlebot to crawl the page, the “escaped” URL has to return a fully rendered version of the “#!” URL.
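As a rough illustration of how the scheme maps URLs (the domain and fragment values below are hypothetical, and this sketch is not Google's own code), a “#!” URL is translated into its “?_escaped_fragment_=” equivalent roughly like this:

```python
from urllib.parse import quote, urlsplit

def escaped_fragment_url(url: str) -> str:
    """Map a "#!" (hash-bang) URL to its "?_escaped_fragment_=" form,
    as described by the AJAX crawling scheme."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url  # not an AJAX-crawlable URL; leave it unchanged
    # The fragment value after "!" is percent-encoded and passed
    # as the _escaped_fragment_ query parameter.
    fragment_value = quote(parts.fragment[1:], safe="")
    separator = "&" if parts.query else "?"
    base = url.split("#", 1)[0]
    return f"{base}{separator}_escaped_fragment_={fragment_value}"

# Hypothetical example:
# escaped_fragment_url("https://example.com/page#!key=value")
# -> "https://example.com/page?_escaped_fragment_=key%3Dvalue"
```

Pages that opt in via the fragment meta tag rather than a “#!” URL are handled the same way, with an empty “?_escaped_fragment_=” parameter appended to the page URL.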

After the switch, Googlebot will be able to render the “#!” URL on its own. The existing URLs will continue to be supported, but website owners will no longer be obliged to provide rendered versions.

Google says websites that use AJAX crawling should not see any major changes. If any issues arise, the affected sites will be notified directly and officially.

Author

Ritu is a qualified Google Ads Professional and the Content Head at PageTraffic. She has spearheaded many successful search marketing campaigns and currently oversees PageTraffic's content marketing operations in India.