
"One could legitimately wonder if the counted visits appearing in Table 7-6 can be attributed to actual people or to robots, spiders, crawlers and the likes. Robots are usually small applications designed to gather data for search engine indexes. Unfortunately, the record of a virtual visit by a robot, spider or crawler looks no different in a Web server transaction log than that of a visit by an actual person. Robot visits may be traceable through name identifiers (i.e., Googlebot), or by a high volume of pages accessed in a very small timeframe, or by their requests for a file on the Web site called 'robots.txt.' The latter normally describes what may or may not be indexed from a given Web site by search engines. Nevertheless, if found, visits by robots must be systematically filtered and separately counted or just ignored when assessing the real usage impact of a Web site."
Continue reading in the book on E-Metrics; details at eMetrics or Web-Metrics or Webometrics - A new book for Library & Information Professionals.
2 comments:
Tag-based solutions like VisiStat do not count spiders and bots. They give a good representation of what humans are doing on websites. www.visistat.com
Anony (I am sure you are not a robot or spider... kidding).
Thanks for the insight. I would be happy if you sent me an email; that would give us a chance to sharpen our shared understanding of fact and fiction.