Mar 13, 2012 by Navneet Kaushal

Google has revamped the crawl errors feature in Google Webmaster Tools, bringing quite a few user interface changes for webmasters.

Errors are now classified into two groups: site errors and URL errors.

Site Errors

Site errors include DNS resolution failures, connectivity issues with the web server, and complications with the robots.txt file. Earlier, these were reported per URL, even though site errors are not specific to individual URLs. Now these errors will be tracked for each type of site-wide error. As Google says, “We’ll also try to send you alerts when these errors become frequent enough that they warrant attention. Furthermore, if you don’t have (and haven’t recently had) any problems in these areas, as is the case for many sites, we won’t bother you with this section. Instead, we’ll just show you some friendly check marks to let you know everything is hunky-dory.”
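These three site-level failure modes can also be checked from your own machine. The sketch below is a hypothetical diagnostic written for illustration (it is not part of Webmaster Tools), assuming only Python's standard library:

```python
import socket
import urllib.error
import urllib.request

def diagnose_site(hostname):
    """Rough, hypothetical check for the three site-wide error classes:
    DNS resolution, server connectivity, and robots.txt fetchability."""
    # 1. DNS resolution failure
    try:
        socket.gethostbyname(hostname)
    except socket.gaierror:
        return "dns_error"
    # 2 & 3. Server connectivity / robots.txt complications.
    # HTTPError (e.g. a 5xx on robots.txt) is a subclass of URLError,
    # so it lands in the same bucket here.
    try:
        urllib.request.urlopen("http://%s/robots.txt" % hostname, timeout=10)
    except urllib.error.URLError:
        return "connectivity_or_robots_error"
    return "ok"

# The reserved .invalid TLD can never resolve, so DNS always fails:
print(diagnose_site("not-a-real-site.invalid"))  # dns_error
```

A real monitoring setup would of course distinguish HTTP status codes and retry transient failures; this only shows the broad categories Google is now reporting separately.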


URL errors

Google has further broken the page-specific URL errors into various categories defined by the cause of the error. Google says, “If your site serves up Google News or mobile (CHTML/XHTML) data, we’ll show separate categories for those errors.”

More Important Errors upfront

Less is more for the revamped crawl errors, as Google will now surface the important errors first. Webmasters will no longer have to sort through 100,000 errors of each type and decide which ones matter and which are less important. Making progress through the errors is now easy, as Google says, “In the new version of this feature, we’ve focused on trying to give you only the most important errors up front. For each category, we’ll give you what we think are the 1000 most important and actionable errors. Some sites have more than 1000 errors of a given type, so you’ll still be able to see the total number of errors you have of each type, as well as a graph showing historical data going back 90 days. For those who worry that 1000 error details plus a total aggregate count will not be enough, we’re considering adding programmatic access (an API) to allow you to download every last error you have, so please give us feedback if you need more.”


The list of pages blocked by robots.txt has been removed, as Google wanted to focus on errors. Webmasters can find information about URLs blocked by robots.txt in the "Crawler access" feature under "Site configuration".

Getting the Error Details

Webmasters can click on the link for an error URL and see a brief explanation of the error and when Google first noted it. Users can mark errors as fixed, list the Sitemaps that contain the URL, see inbound links, and view help content for the error type.

Fix those Errors!

Webmasters can now fix errors more easily, as errors are ranked in order of importance. Users can mend broken links, fix bugs in their server software, update Sitemaps, and attend to other errors at the earliest.


As Google says, "We determine this based on a multitude of factors, including whether or not you included the URL in a Sitemap, how many places it’s linked from (and if any of those are also on your site), and whether the URL has gotten any traffic recently from search. Once you think you’ve fixed the issue (you can test your fix by fetching the URL as Googlebot), you can let us know by marking the error as “fixed” if you are a user with full access permissions."
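The "Fetch as Googlebot" test Google mentions runs inside Webmaster Tools itself. As a rough local approximation only, you can request your page while presenting Googlebot's published User-Agent string; note that this is not real Googlebot traffic (genuine Googlebot is verified by reverse DNS), and a server that sniffs user agents may respond differently. The URL below is hypothetical:

```python
import urllib.request

# Googlebot's published User-Agent string
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def build_googlebot_request(url):
    """Build a request presenting Googlebot's User-Agent string.
    This only approximates Webmaster Tools' Fetch as Googlebot feature."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = build_googlebot_request("http://example.com/fixed-page.html")
print(req.get_header("User-agent"))
# urllib.request.urlopen(req) would then perform the actual fetch
```

Checking the response status of such a fetch (a 200 instead of the previously reported error) is a quick sanity check before marking the error as fixed in the tool.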

What do you think of this face lift? Do share your views.

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.