In most cases the site owner wants the resource to be indexed by search engines as quickly and completely as possible. However, there are cases where a site owner might wish to prevent certain pages from being indexed. A site may contain, for example, technical pages (scripts and the like) or temporary pages that will soon be deleted. There is no point in indexing such pages, and doing so benefits neither the resource owner nor the search engine. To meet this need, a standard method was developed for informing search engine robots that certain sections of a website should not be indexed. This information is always contained in a file named robots.txt, which must be located in the root directory of the site (at www.site.com/robots.txt). Before it starts indexing a resource, a search engine robot requests the robots.txt file and, if the file is found, obeys the instructions in it.
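To illustrate how a well-behaved robot applies these instructions, the sketch below uses Python's standard-library `urllib.robotparser` to parse a small robots.txt and check whether particular URLs may be fetched. The file contents and the directory names (`/scripts/`, `/temp/`) are hypothetical examples, not taken from any real site.

```python
import urllib.robotparser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /scripts/
Disallow: /temp/
"""

parser = urllib.robotparser.RobotFileParser()
# parse() accepts the file as a list of lines; a real crawler would
# instead call set_url() and read() to fetch www.site.com/robots.txt.
parser.parse(robots_txt.splitlines())

# A regular page is allowed; pages under the forbidden sections are not.
print(parser.can_fetch("*", "http://www.site.com/index.html"))    # True
print(parser.can_fetch("*", "http://www.site.com/scripts/a.js"))  # False
```

A crawler would run such a check for every URL before downloading it, skipping any URL for which `can_fetch` returns `False`.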
If you wish your entire site to be indexed, simply omit the robots.txt file from your site or include a blank one. Search engine robots will then scan your site without any limitations. If you wish to prevent parts of your website from being indexed, you should learn more about the robots.txt format.
The robots.txt format is very simple. You specify the name of the robot in a User-agent line (or use the * character to apply the instructions to any robot) and then list the forbidden sections, one Disallow line per section.
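For example, a robots.txt that forbids all robots from indexing two hypothetical sections of a site (the directory names are illustrative, not prescribed by the standard) would look like this:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /temp/
```

Each record starts with a User-agent line naming the robot it applies to, followed by one or more Disallow lines; an empty Disallow value means nothing is forbidden for that robot.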
It is possible to create a robots.txt file manually using a regular text editor without any difficulty. However, using our robots.txt generator program can help you avoid accidental syntax errors when creating the file. A screenshot of the robots.txt generator editor window is shown below.
Download Seo Administrator - robots.txt generator