The simplest of all meta tags, the robots tag, tells the Googlebot, Google's search engine spider, that it may index your pages and follow the links on them. In order to index your website properly and include all of your web pages, search engines send their spiders to review and scan your website on a regular basis. How often depends on the site; frequently updated sites may be recrawled every few days.
When the spiders read your meta tags and see that your robots tag indicates "all," they simply start crawling. Crawlers will index most sites by default even without the tag, but including it makes your intent explicit to search engine crawlers. Make sure the robots tag is included in your meta tags to support thorough crawling.
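For reference, here is a minimal sketch of how the tag appears in a page's head section. The value "all" is shorthand for "index, follow," which is also the default crawlers assume when no robots tag is present:

```html
<head>
  <!-- "all" = index this page and follow its links
       (equivalent to content="index, follow") -->
  <meta name="robots" content="all">
</head>
```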
Some Internet marketers and webmasters recommend submitting each page of your site directly to the search engines, one page at a time. This isn't necessary, especially if you are including the robots tag. Search engine crawlers do the work for you.
What's important is that Google indexes your site, and when it does, it can find all of your content. The robots tag can help with that process. Equally important, if not more so, are compliance with W3C standards (industry-accepted HTML standards) and a sitemap. When you combine the robots tag with an easily indexed website, Google and other major search engines can find and index all of the pages on your website or blog.
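A sitemap is simply an XML file listing the URLs you want crawled. A minimal sketch follows the sitemaps.org format; the example.com address is a placeholder for your own domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

Save it as sitemap.xml at the root of your site so crawlers can find it easily.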