Webmasters debating whether their websites should include XML sitemaps, RSS/Atom feeds, or both can now turn to Google's official webmaster blog for an answer. Google advocates using both XML sitemaps and RSS/Atom feeds to ensure optimal crawling.
Here is what Google has to say on the matter: “For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index. Note that submitting sitemaps or feeds does not guarantee the indexing of those URLs.”
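The sitemap format Google is referring to is the standard sitemaps.org XML format. A minimal sketch (the domain, paths, and dates below are placeholder examples) listing two pages with their last modification times might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page on the site -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2014-10-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/articles/optimal-crawling</loc>
    <lastmod>2014-10-15</lastmod>
  </url>
</urlset>
```

The `<loc>` element holds the page's URL and `<lastmod>` its last modification date, the two pieces of information Google highlights.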
Google also provides important guidance on what to include in XML sitemaps and RSS/Atom feeds, notably the URLs and their last modification times. URLs in XML sitemaps and RSS/Atom feeds should be ones that Googlebot can fetch, and they should be canonical URLs rather than the URLs of duplicate pages. The last modification time should be specified for each URL in the sitemap and RSS/Atom feed. Google also advises that XML sitemaps contain the URLs of the pages on the site, and the search engine giant has provided guidelines for single as well as multiple XML sitemaps. The RSS/Atom feed should include all updates made since at least the last time Google downloaded it.
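On the feed side, the same two pieces of information appear in Atom markup. A minimal Atom feed sketch (again with invented placeholder URLs and timestamps) would carry the last modification time in `<updated>` and the canonical URL in each entry's `<link>`:

```xml
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Site Updates</title>
  <link href="https://example.com/"/>
  <id>https://example.com/feed</id>
  <updated>2014-10-15T09:00:00Z</updated>
  <!-- One <entry> per recently updated page, newest first -->
  <entry>
    <title>Optimal Crawling</title>
    <link href="https://example.com/articles/optimal-crawling"/>
    <id>https://example.com/articles/optimal-crawling</id>
    <updated>2014-10-15T09:00:00Z</updated>
  </entry>
</feed>
```

Because the feed only needs to cover changes since Google's last download, it stays small even for large sites, which is what keeps the crawl of fresh content fast.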
Google also explains how including both sitemaps and RSS/Atom feeds makes optimal crawling possible: “Generating both XML sitemaps and Atom/RSS feeds is a great way to optimize crawling of a site for Google and other search engines. The key information in these files is the canonical URL and the time of the last modification of pages within the website. Setting these properly, and notifying Google and other search engines through sitemaps, pings and PubSubHubbub, will allow your website to be crawled optimally, and represented accordingly in search results.”
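One of the notification mechanisms Google mentions is the sitemap “ping”, a simple HTTP GET against a search engine's ping endpoint with the sitemap's URL as a parameter. As a rough sketch (the endpoint shown is Google's documented sitemap ping URL; the helper function name and the example sitemap URL are our own), the request URL can be built like this:

```python
from urllib.parse import urlencode

def build_sitemap_ping_url(sitemap_url):
    # Google's sitemap ping endpoint; other search engines
    # offer similar endpoints. The sitemap URL must be
    # percent-encoded when passed as a query parameter.
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

ping_url = build_sitemap_ping_url("https://example.com/sitemap.xml")
print(ping_url)
# https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```

Fetching the resulting URL (for example with any HTTP client, after every sitemap update) tells Google the sitemap has changed; PubSubHubbub serves the same purpose for feeds via a publish/subscribe hub.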