RapidBot is a RapidWeaver plug-in that allows you to quickly create a robots.txt file for your site.
The robots.txt file is retrieved by search engines such as Google and Bing and is used to state which pages should be indexed and which must be ignored.
A robot (also known as a spider or web crawler) is a program that automatically traverses the web's hypertext structure by retrieving a document and recursively retrieving all documents it references.
You decide which parts of your site spiders will visit. Choose a crawler from our presets or specify a custom one, then type a folder name or select it using the default link picker. RapidBot is perfectly integrated with RapidWeaver!
RapidBot lets you provide specific indexing rules. Enable or disable crawling of files and folders in a natural way: RapidBot translates your directives into rules that search engine bots understand.
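For example, a rule that keeps all crawlers out of two folders while leaving the rest of the site open could be written out in robots.txt roughly as follows (a minimal sketch; the folder names are placeholders, not RapidBot presets):

    User-agent: *
    Disallow: /private/
    Disallow: /drafts/

Here "User-agent: *" addresses every crawler, and each "Disallow" line is one directive within that rule.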
Don't forget that instructing search engine bots helps add visibility to your site and lets people reach you more efficiently by excluding irrelevant content.
RapidBot is a must-have SEO tool!
Only if search engines know what to do with your pages can they give you a good ranking.
What's New in This Release:
· Brand new interface.
· Completely rewritten plugin.
· Support for multiple directives for each rule.