When my bot buster first started operating, I noticed the occasional gibberish user agent string. The theory behind this, I'm sure, is that if a website is blocking known user agents, then a string of random gibberish can skirt past that filter.
The problem for them is that nothing is getting through, so the random user agent strings hitting my site have escalated to the point where it's hysterical to watch them thrash.
A small sampling of the thousands from the other day:
22.214.171.124 2uigq2oecesvv2nwso rwiakBsBue Bobgw2nuB
126.96.36.199 emwx4cxnd pedafhfpac
188.8.131.52 pepgfu wjdjqrxckulhwiflmrdsmkc mjvldn
184.108.40.206 mairwthe Ifirpl8tiwotwyi lsu
220.127.116.11 r9Hreiynmkxmpjh ioHmmknpdmid
18.104.22.168 ewoqaohlcegoD emkdywx
22.214.171.124 bedmdFjkFhc4a noFjajakffieapvngdtpwxk
126.96.36.199 aphErvbtijj vulgctlslo
188.8.131.52 jgbhwntsdlprxcwogijI8orrw b8
184.108.40.206 DrbspcgyubxrpeikfiihxD mh
220.127.116.11 cvwkvl6kfujhqlujqblFl dffrepmrxdspmdFjq

This is why I keep preaching that blocking by user agent only works for the legitimate crawlers willing to let you block them. The scrapers aren't playing by any of the old rules, and I'm shocked these idiots didn't just use a browser string, which would at least stand a chance of getting a handful of pages from my site before they got too greedy.
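To make the point concrete, here is a minimal sketch of why a substring blacklist is useless against randomized strings; the blacklist entries below are illustrative placeholders, not anyone's actual list:

```python
# Assumed sample blacklist of known non-browser agents.
BLACKLIST = {"wget", "curl", "libwww-perl", "python-urllib"}

def blocked_by_blacklist(user_agent: str) -> bool:
    """Return True if the user agent matches any blacklisted substring."""
    ua = user_agent.lower()
    return any(bad in ua for bad in BLACKLIST)

# A known tool is caught, but a random gibberish string matches
# nothing on the list and sails straight through.
print(blocked_by_blacklist("Wget/1.21"))                      # True
print(blocked_by_blacklist("2uigq2oecesvv2nwso rwiakBsBue"))  # False
```

Every gibberish string is effectively unique, so no finite blacklist can ever keep up.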
Sorry, webmasters, but the rules of this game have changed: you really need to block all non-browser agents and allow legitimate crawlers like Google, Yahoo and MSN by IP only. Any other method is just wasting your time, because random user agents cannot be stopped by your old traditional techniques.
The blacklist is out, the whitelist is in.
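On the whitelist side, one workable approach is forward-confirmed reverse DNS, which the major search engines document for verifying their crawlers: reverse-resolve the IP, check the hostname against trusted domains, then forward-resolve it to confirm it maps back. A sketch, assuming illustrative hostname suffixes:

```python
import socket

# Assumed trusted crawler domains; check each engine's own
# documentation for the authoritative list.
TRUSTED_SUFFIXES = (".googlebot.com", ".google.com",
                    ".search.msn.com", ".crawl.yahoo.net")

def is_legit_crawler(ip: str) -> bool:
    """Verify a claimed crawler IP via forward-confirmed reverse DNS."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
    except OSError:
        return False  # no reverse DNS at all: not a legit crawler
    if not host.lower().endswith(TRUSTED_SUFFIXES):
        return False  # reverse DNS points somewhere untrusted
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
    # Forward lookup must confirm the original IP, or the
    # reverse record was spoofed.
    return ip in forward_ips
```

The forward-confirmation step matters because anyone controlling reverse DNS for their own IP block can make it claim to be googlebot.com; only Google's forward records will actually confirm it.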