Google has announced on the Official Google Blog the launch of a new feature that uses RSS and Atom feeds to discover new webpages. With this feature, finding new pages on Google will become quicker, as new content can appear in search results soon after it goes live.
In recent years, RSS/Atom feeds have become a popular mechanism for content publication on the World Wide Web. By monitoring feeds, Google can detect new content and get those pages indexed more quickly than traditional crawling methods allow. The Google team mentions several potential sources for feed updates, such as notification services, Google Reader subscriptions, or direct crawls. In the future, the team may also explore other mechanisms, such as PubSubHubbub, for identifying recently updated items.
If you want Google to use your RSS/Atom feeds for discovery, make sure that your robots.txt does not disallow crawling of these files. To verify that Googlebot can crawl your feeds and find your pages faster, you can test the feed URLs with the robots.txt tester in Google Webmaster Tools.
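As a quick local sanity check before using the Webmaster Tools tester, you can parse your own robots.txt rules and ask whether Googlebot is allowed to fetch a feed URL. This is a minimal sketch using Python's standard `urllib.robotparser`; the domain, paths, and rules below are hypothetical examples, not values from the announcement.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block a private area, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A public feed is crawlable; a feed under the disallowed path is not.
print(parser.can_fetch("Googlebot", "https://example.com/feed.xml"))          # True
print(parser.can_fetch("Googlebot", "https://example.com/private/feed.xml"))  # False
```

If the check for your feed URL returns False, adjust the Disallow rules (or add an explicit Allow line for the feed path) so Googlebot can reach the feed.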