Robots.txt: This file is used to communicate with search engine spiders (a.k.a. "robots"), telling them which parts of the site they should not access. The file is purely advisory: well-behaved spiders honor it, but it is not a mechanism for securing the filesystem.
To exclude all robots from the entire server, place the following in the file:
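    User-agent: *
    Disallow: /

(The asterisk matches every user agent, and "Disallow: /" marks the entire site as off-limits.)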
To allow all robots complete access to the webroot, place the following in the file:
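    User-agent: *
    Disallow:

(An empty Disallow value tells every spider that nothing is blocked; this is the standard allow-all form, and it has the same effect as having no robots.txt file at all.)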
Note: An exclude-all file can be a problem if your hosting company accidentally places one in the server's main directory: spiders visiting the host's client sites will read that robots.txt file and NOT index any of the sites.