Over at SEOmoz, Duncan Morris has written an informative post outlining the pros and cons of using XML Sitemaps.
Sitemaps.org defines Sitemaps as follows: “Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.”
How XML Sitemaps Are Useful:
- A Sitemap can list all of the URLs on your website. This helps search engines by enabling their crawlers to crawl and index pages they could not locate previously.
- You can set the relative priority of your pages using the optional “priority” tag in the Sitemap. Search engine crawlers can then use this hint to decide which of your pages to crawl and index first.
- Optional tags such as “lastmod” and “changefreq” keep search engines informed about changes you’ve made to your website and how often you make them. Use “lastmod” to tell crawlers when a page was last modified; use “changefreq” to indicate how frequently a page changes.
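To make the tags above concrete, here is a minimal sketch of a Sitemap file that uses all three optional tags alongside the required “loc” tag (the URL and values shown are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child tag: the page's full URL -->
    <loc>http://www.example.com/</loc>
    <!-- lastmod: when the page was last modified (W3C Datetime format) -->
    <lastmod>2008-05-01</lastmod>
    <!-- changefreq: how frequently the page is likely to change -->
    <changefreq>weekly</changefreq>
    <!-- priority: relative importance within this site, 0.0 to 1.0 (default 0.5) -->
    <priority>0.8</priority>
  </url>
</urlset>
```

Each additional page gets its own “url” block inside the same “urlset”.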
More information about the tags mentioned above is available at Google Webmaster Tools.
The flip side of using XML Sitemaps:
- One of the major disadvantages of XML Sitemaps is that they are publicly accessible: when you assign priorities to your pages, crawlers are not the only ones who can read them. Your competitors can view your Sitemap too and see exactly which pages you consider most important.
- Due to the ever-evolving nature of search engines and their crawlers, it is almost impossible to keep your Sitemaps fresh. In fact, a Sitemap generated from a database can become outdated the moment it is crawled and indexed.
For more information and a detailed perspective on the pros and cons of using XML Sitemaps, please visit SEOmoz.