Blocking JavaScript and CSS May Affect Indexing, Says Latest Google Webmaster Update!

Oct 28, 2014 | 1,992 views | by Ritu Sharma

Google has recently updated its Webmaster Guidelines, a change that will impact sites blocking JavaScript or CSS files. The Google Webmaster Central Blog has announced that the indexing system has been updated to behave more like a modern browser, with CSS and JavaScript active. Websites blocking CSS or JavaScript may therefore see changes in how they are indexed.

Google provided clear advice regarding allowing Googlebot to access the CSS, JavaScript, and image files used by a website, saying:

"This provides you optimal rendering and indexing for your site. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.”

Google Indexing Advice Update

Webmasters will now experience a process change due to the upgraded system. Google has cautioned that users must no longer think of its indexing system as a "text-only browser."

Google also provided advice for this new phase:

  • Google's rendering engine may not be able to support every technology
  • Website design should follow progressive enhancement principles so that the engine can see usable, supported content
  • Users should pay attention to page load speed in the context of indexing
  • Servers should be able to serve JavaScript and CSS files to Googlebot (see the sketch after this list)
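One quick way to sanity-check that last point is to request an asset while identifying as Googlebot and confirm the server returns it. The sketch below uses only Python's standard library; the asset URL is a placeholder, and the user-agent string is the one Googlebot publicly identifies with.

```python
import urllib.request

# Placeholder asset URL; swap in a real CSS or JS file from your site.
ASSET_URL = "http://example.com/assets/css/site.css"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

request = urllib.request.Request(ASSET_URL,
                                 headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(request, timeout=10) as response:
    # A 200 status with a stylesheet/script content type suggests the
    # server hands the asset to crawlers the same way it does to browsers.
    print(response.status, response.headers.get("Content-Type"))
```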

Fetch & Render Diagnostic Tool Updated

Alongside this, Google has also updated its Fetch as Google diagnostic tool, created to let webmasters simulate how Googlebot crawls and renders a URL on their website.

According to Google Support, the tool works in the following manner:

"When the Fetch as Google tool is in fetch mode, Googlebot crawls any URL that corresponds to the path that you requested. If Googlebot is able to successfully crawl your requested URL, you can review the response your site sent to Googlebot. This is a relatively quick, low-level operation that you can use to check or debug suspected network connectivity or security issues with your site.

The fetch and render mode tells Googlebot to crawl and display your page as browsers would display it to your audience. First, Googlebot gets all the resources referenced by your URL such as picture, CSS, and JavaScript files, running any code, to render or capture the visual layout of your page as an image. You can use the rendered image to detect differences between how Googlebot sees your page, and how your browser renders it."
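The fetch half of this is easy to approximate yourself when debugging connectivity or redirect issues outside the tool. The sketch below is a rough stand-in for fetch mode, not the tool itself; the URL is a placeholder, and nothing here executes JavaScript, which is precisely what the render mode adds on top.

```python
import urllib.request

# Placeholder page URL; this plain GET executes no JavaScript, so it
# captures only the raw response, much like the tool's fetch mode.
request = urllib.request.Request("http://example.com/",
                                 headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(request, timeout=10) as response:
    print("Status:", response.status)
    print("Final URL:", response.geturl())  # reveals any redirect
    snippet = response.read(500).decode("utf-8", errors="replace")
    print(snippet)  # first bytes of the HTML your server actually sent
```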

A blog post published by Google in May this year informed webmasters that these changes were in the works. The official post also described some issues webmasters might face and how these can be dealt with. The examples included:

“If your website is blocking JavaScript or CSS, Google's indexing system won’t be able to read the page like an average user.

There may be a negative impact on your website if your server is ill-equipped to handle the volume of crawl requests.

Your pages may not be rendered properly if the JavaScript is too complex.

In some instances, JavaScript may remove, rather than add, content from a page, which will prevent proper indexing of the page.”

Ritu from PageTraffic is a qualified Google AdWords Professional and Content Head at PageTraffic. She has spearheaded many successful search marketing campaigns and currently oversees content marketing operations of PageTraffic in India.