Stop bots, crawlers and search engine spiders from filling up your visitor statistics logs

Any online content is subject to being indexed by search engines and other web caching scripts. In order to deliver readers to your website, your content has to be analyzed and indexed by the search engines. Search engine indexing scripts, otherwise referred to as robots, bots, crawlers or spiders, are one of the major groups of tools that ultimately make your content visible to your target audience.

fraudLog captures all of your visitor data, along with the majority of known search engine robots and other indexing scripts.
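
fraudLog's detection logic is not published, but trackers commonly recognize indexing scripts by matching each request's User-Agent header against known crawler signatures. The Python sketch below is a minimal, hypothetical illustration of that approach; the signature list and function names are assumptions, not fraudLog's actual code:

    # Hypothetical sketch of user-agent based bot detection.
    # fraudLog's real signature list and matching rules are not published.
    KNOWN_BOT_SIGNATURES = (
        "googlebot", "bingbot", "slurp", "duckduckbot",
        "baiduspider", "yandexbot", "crawler", "spider",
    )

    def is_known_bot(user_agent: str) -> bool:
        """Return True if the User-Agent matches a known indexing script."""
        ua = user_agent.lower()
        return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

    # Example: a Googlebot request is flagged, a normal browser is not.
    print(is_known_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))    # True
    print(is_known_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False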

If too much of your visitor log space is being taken up by automated bots, crawlers and search engine spiders, you can stop these indexing scripts from being logged by activating the [ log filter ] option.

To activate the filter, follow these steps:

  • Log in to your account
  • Click on the [My Projects] menu
  • Click on the [edit] link for the project you want to filter against bots
  • Select [Do NOT log robot visits] from the "Log Filter" drop-down menu
  • Click on the "Update" button

Once activated, this option prevents the visitor, link, page and campaign trackers from logging the majority of bots. From the moment the filter is enabled, you will no longer see traces of their visits in your logs; entries recorded before that point remain.
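
To make that behaviour concrete, here is a minimal hypothetical sketch, reusing the is_known_bot() helper from the sketch above. With the filter enabled, bot hits are dropped at capture time, which is why earlier entries stay in the log:

    # Hypothetical sketch of the log filter's effect; names and structure
    # are illustrative only, not fraudLog's internals.
    visitor_log = []           # entries already logged are never rewritten
    log_filter_enabled = True  # "Do NOT log robot visits" is selected

    def record_visit(user_agent: str, page: str) -> None:
        # With the filter on, bot hits are discarded at capture time,
        # so filtering is not retroactive.
        if log_filter_enabled and is_known_bot(user_agent):
            return  # bot visit is silently skipped, not logged
        visitor_log.append({"user_agent": user_agent, "page": page})

    record_visit("Mozilla/5.0 (compatible; Googlebot/2.1)", "/home")    # skipped
    record_visit("Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "/home")  # logged
    print(len(visitor_log))  # 1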


