In the BIG SHOCKER of the day department: when I installed my prototype bot blocker on another of my own websites today to expand my data stream, wouldn't you know it, it caught an active spider crawling many hundreds of pages the second it was enabled.
This crazy-assed spider is still going strong, or thinks it is, well over an hour after I cut it off from getting real data.
The spider is Charlotte, with these specs:
209.249.86.4 "Mozilla/5.0 (compatible; Charlotte/1.0b; charlotte@betaspider.com)"
I'd never heard of Charlotte before, so I took a peek to see what others might know about it and stumbled into the most hysterical thread I've ever seen in the OsCommerce Forums: all these store owners frantically chasing down spider names and dropping them into something called their spiders.txt file, or some shit.
Dudes, blocking by user agent is so 1990's; you'll stress out and become incontinent doing it that way. What a waste of time. I wish I were ready to help you all already, but such is life, and quality takes time.
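For the curious, that whole spiders.txt exercise boils down to matching the incoming user-agent string against a hand-maintained list of names. Here's a minimal Python sketch of the idea (not OsCommerce's actual code; the list entries and helper name are made up for illustration), which also shows exactly why it's a loser's game: any spider can send whatever user-agent it pleases, so you only ever catch the polite ones.

# Minimal sketch of user-agent blocklisting, the spiders.txt approach.
# Names below are illustrative; real lists run to hundreds of entries
# that somebody has to chase down and maintain by hand.

BLOCKED_AGENT_SUBSTRINGS = [
    "charlotte",   # the spider caught above
    "slurp",
    "msnbot",
]

def is_blocked(user_agent: str) -> bool:
    """Return True if the user-agent matches any blocklisted name.

    The obvious weakness: a spider that lies about its user-agent
    (or sends a blank one) sails right through.
    """
    ua = user_agent.lower()
    return any(name in ua for name in BLOCKED_AGENT_SUBSTRINGS)

if __name__ == "__main__":
    # The honest spider gets caught...
    print(is_blocked("Mozilla/5.0 (compatible; Charlotte/1.0b; charlotte@betaspider.com)"))  # True
    # ...while anything pretending to be a browser walks right in.
    print(is_blocked("Mozilla/5.0 (Windows; U) Firefox/1.5"))  # False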