Vanessa Fox, on the official Google Webmaster Central Blog, has compiled some tips for building crawlable sites.
She explains clearly how to make sure that both visitors and search engines can access your content.
The key steps are:
- Check the Crawl errors section of webmaster tools for any pages Googlebot couldn't access due to server or other errors. If Googlebot can't access the pages, they won't be indexed and visitors likely can't access them either.
- Make sure your robots.txt file doesn't accidentally block search engines from content you want indexed. You can see a list of the files Googlebot was blocked from crawling in webmaster tools. You can also use Google's robots.txt analysis tool to make sure you're blocking and allowing the files you intend.
- Check the Googlebot activity reports to see how long it takes to download a page of your site, so you can catch any network slowness issues.
- If pages of your site require a login and you want the content from those pages indexed, ensure you include a substantial amount of indexable content on pages that aren't behind the login. For instance, you can put several content-rich paragraphs of an article outside the login area, with a login link that leads to the rest of the article.
- How accessible is your site? How does it look in mobile browsers and screen readers? It's well worth testing your site under these conditions and ensuring that visitors can access the content of the site using any of these mechanisms.
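Some of the checks above can also be run from your own machine. The sketch below, a minimal and hypothetical example (the domain and paths are placeholders, and the `check_page` helper is not from the original post), uses Python's standard library to test whether robots.txt permits Googlebot to crawl a page, whether the page returns a server error, and how long it takes to download:

```python
import time
import urllib.error
import urllib.request
import urllib.robotparser


def check_page(site, path, user_agent="Googlebot", timeout=10):
    """Report whether robots.txt permits crawling `path`, the HTTP status,
    and how long the page takes to download. Hypothetical helper."""
    # Does robots.txt allow this user agent to fetch the page?
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(site + "/robots.txt")
    rp.read()
    allowed = rp.can_fetch(user_agent, site + path)

    # Fetch the page, recording the HTTP status and download time.
    start = time.monotonic()
    try:
        with urllib.request.urlopen(site + path, timeout=timeout) as resp:
            status = resp.status
            resp.read()  # force the full download so the timing is realistic
    except urllib.error.URLError as err:
        status = getattr(err, "code", None)  # None for network-level failures
    elapsed = time.monotonic() - start
    return allowed, status, elapsed


if __name__ == "__main__":
    # Replace with your own domain and the pages you expect to be indexed.
    for path in ["/", "/articles/sample-article"]:
        allowed, status, elapsed = check_page("https://www.example.com", path)
        print(f"{path}: allowed={allowed} status={status} time={elapsed:.2f}s")
```

This won't replace the webmaster tools reports, which show what Googlebot actually encountered, but it is a quick way to spot an accidental Disallow rule or a slow page before Googlebot does.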
Vanessa also covers how to make sure your content is viewable and how to keep your site crawlable.
For more detail, read the full post.