Before I had referrer/stat logs, I never realized how much of the traffic on our cc site is generated by search engine robots constantly crawling around. Yesterday we had 1466 page requests, and a good percentage of them weren’t from actual people. They were using so much bandwidth that I set up a robots.txt file to disallow the robots from certain directories, particularly one that has over 230 files in it. Most of the human hits we get from search engines come through generally related terms anyway, not searches for what’s in those specific files. We weren’t getting TOO close to the bandwidth limit our domain hosting service allows us, but still. In any case, this will be an interesting experiment to see if and how much our stats change.
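For anyone curious, the robots.txt change is just a couple of lines. The actual directory names on our site aren't shown here, so this is a sketch with a made-up path standing in for the big 230-file directory:

```
# robots.txt — placed in the site's root directory
# Applies to all crawlers that honor the robots exclusion standard
User-agent: *
Disallow: /files/        # hypothetical path for the large directory
Disallow: /archive/      # hypothetical second excluded directory
```

Well-behaved robots fetch this file before crawling and skip anything under the disallowed paths, though nothing actually forces a crawler to obey it.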
