Thursday, June 5, 2008

How do I use a robots.txt file to control access to my site?

A robots.txt file places restrictions on the search engine robots (known as "bots") that crawl the web. These bots are automated, and before they access the pages of a site, they check for a robots.txt file that tells them which pages they are not allowed to access.
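To illustrate that check, here is a small sketch of how a well-behaved bot consults robots.txt before fetching a page. It uses Python's standard urllib.robotparser module; the example.com URLs and the /private/ path are placeholders, not part of any real site.

import urllib.robotparser

# Download and parse the site's robots.txt (placeholder domain)
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given URL
allowed = rp.can_fetch("Googlebot", "http://www.example.com/private/page.html")
print(allowed)  # False if robots.txt disallows /private/ for this bot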
You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file (not even an empty one).
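As a minimal example, a robots.txt file placed at the root of your site (for instance www.example.com/robots.txt) that keeps all bots out of a hypothetical /private/ directory would look like this:

User-agent: *
Disallow: /private/

The "User-agent: *" line applies the rule to all crawlers, and each Disallow line names a path they should not crawl. An empty Disallow line (Disallow:) would allow everything.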
For more information, see the link below:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360