Create your own robots.txt
A robots.txt file is a short text file that resides in the root directory of your website. Before search engines spider your site, they check this file to see which files, file types, and directories they are not allowed to crawl. This freeware utility will assist you in creating robots.txt files.
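A minimal robots.txt might look like the sketch below; the directory names are only examples, not paths the utility itself generates:

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of these example directories
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/
```

`User-agent: *` matches every well-behaved crawler, and each `Disallow` line lists a path prefix that crawler should skip.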
As a little 'extra', it also includes a block list for many unwanted spiders that crawl your site only to harvest the email addresses stored on your pages (spam bots).
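Such a block works by naming the offending crawler's user agent and disallowing everything; the user agent shown here is a commonly cited email-harvester name, used purely as an illustration:

```
# Block an email-harvesting bot entirely (example user agent)
User-agent: EmailCollector
Disallow: /
```

Note that robots.txt is advisory: polite crawlers obey it, but malicious bots may ignore it entirely.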
Fill in the fields, replacing the default values with your own. Then create your robots.txt file, save it to your desktop, and upload it to the root of your site.