Page Promoter 7.2 - User Guide and FAQ

Robots.txt Generator

The Robots Exclusion Protocol is a method that allows website administrators to tell visiting robots which parts of the site should not be crawled. Before indexing your site, a spider downloads the robots.txt file, which contains instructions on what should and should not be indexed. The robots.txt file is therefore the key to controlling spiders. If you have a large website or update it frequently, creating and editing this file by hand is tedious and error-prone.

Robots.txt Editor is an easy-to-navigate visual editor that lets you specify different directives for selected spiders in specific areas of the site and generate the robots.txt file quickly and easily. The robots.txt file is just a plain text file containing instructions for search engine robots. There are different types of instructions: for individual robots, for all robots, for certain folders, for certain file types, and so on. The file can be created with a simple text editor such as Notepad or WordPad, but it is very difficult to write such a file manually without making a mistake somewhere.

There are two tabs in this module: the Spider List tab and the Disallow tab. On the Spider List tab you can view the names of all robots in the program's database (176 altogether). There are three fields for each robot, and you can sort the robots alphabetically by any of these fields by clicking on the column header.
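To illustrate the file format the editor produces, here is a minimal sketch of generating a robots.txt file from per-spider disallow rules. The function name, spider names, and paths below are illustrative assumptions, not Page Promoter's own code or its 176-robot database.

```python
# Sketch: build robots.txt text from a mapping of user-agent -> disallowed paths.
# Each user-agent gets its own block of "User-agent:" / "Disallow:" lines,
# separated by a blank line, as the Robots Exclusion Protocol expects.

def generate_robots_txt(rules):
    """rules: dict mapping a user-agent string to a list of disallowed paths."""
    blocks = []
    for agent, paths in rules.items():
        lines = ["User-agent: %s" % agent]
        lines += ["Disallow: %s" % path for path in paths]
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks) + "\n"

rules = {
    "*": ["/cgi-bin/", "/tmp/"],   # directives for all robots
    "Googlebot": ["/private/"],    # directives for one specific robot
}
print(generate_robots_txt(rules))
```

Running the sketch prints a valid robots.txt body: a block for all robots (`User-agent: *`) followed by a block for the named spider, each listing its disallowed paths.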
© 2006 Qweas Home - Privacy Policy - Terms of Use - Site Map - About Qweas |