When my bot buster first started operating, I noticed the occasional gibberish user agent string. The theory behind this, I'm sure, is that if a website is blocking known user agents, you can skirt past that technique with a string of gibberish.
The problem is that they've noticed nothing is getting through, and the random user agent strings being thrown at my site have escalated to the point where it's hysterical to watch them thrash.
A small sampling of the thousands logged the other day:
220.127.116.11 2uigq2oecesvv2nwso rwiakBsBue Bobgw2nuB
18.104.22.168 emwx4cxnd pedafhfpac
22.214.171.124 pepgfu wjdjqrxckulhwiflmrdsmkc mjvldn
126.96.36.199 mairwthe Ifirpl8tiwotwyi lsu
188.8.131.52 r9Hreiynmkxmpjh ioHmmknpdmid
184.108.40.206 ewoqaohlcegoD emkdywx
220.127.116.11 bedmdFjkFhc4a noFjajakffieapvngdtpwxk
18.104.22.168 aphErvbtijj vulgctlslo
22.214.171.124 jgbhwntsdlprxcwogijI8orrw b8
126.96.36.199 DrbspcgyubxrpeikfiihxD mh
188.8.131.52 cvwkvl6kfujhqlujqblFl dffrepmrxdspmdFjq

This is why I keep preaching that blocking by user agent only works for the legit crawlers that are willing to let you block them. The scrapers aren't playing by any of the old rules, and I'm shocked these idiots didn't just use a browser string, which would at least have had a chance of getting a handful of pages off my site before they got too greedy.
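The whitelist idea can be sketched in a few lines: instead of blacklisting known bad agents, reject anything that doesn't match a known browser prefix, so gibberish like the strings above fails by default. This is a minimal illustration, not my actual bot buster code, and the pattern list here is deliberately oversimplified:

```python
import re

# Illustrative whitelist: allow only agents matching known browser prefixes.
# Real browsers announce themselves as "Mozilla/x" or "Opera/x"; anything
# else, including random gibberish, is rejected by default.
BROWSER_PATTERNS = [
    re.compile(r"^Mozilla/\d"),   # IE, Firefox, Safari all claim Mozilla
    re.compile(r"^Opera/\d"),
]

def is_browser_agent(user_agent: str) -> bool:
    """Return True only if the agent string matches a known browser prefix."""
    return any(p.match(user_agent) for p in BROWSER_PATTERNS)

# Gibberish strings from the log sample fail the check:
print(is_browser_agent("emwx4cxnd pedafhfpac"))                          # False
print(is_browser_agent("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # True
```

Of course the scrapers can forge a browser string too, which is why this is only the first gate, not the whole defense.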
Sorry, webmasters, but the rules of this game have changed: you really need to block all non-browser agents and allow legit crawlers like Google, Yahoo and MSN by IP only. Any other method is just wasting your time, because random user agents cannot be stopped by the old traditional techniques.
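Validating crawlers "by IP only" can be done with a forward-confirmed reverse DNS check: reverse-resolve the visiting IP, make sure the hostname belongs to a known crawler domain, then forward-resolve that hostname to confirm it maps back to the same IP. Here's a rough sketch of that check; the suffix list and function names are my own illustration, not production code:

```python
import socket

# Illustrative crawler-domain whitelist (not exhaustive).
CRAWLER_SUFFIXES = (".googlebot.com", ".crawl.yahoo.net", ".search.msn.com")

def verify_crawler_ip(ip, reverse=socket.gethostbyaddr,
                      forward=socket.gethostbyname_ex):
    """Return True only if `ip` reverse-resolves to a whitelisted crawler
    hostname AND that hostname forward-resolves back to the same IP."""
    try:
        host = reverse(ip)[0]                  # reverse DNS lookup
    except OSError:
        return False                           # no PTR record: not a real crawler
    if not host.endswith(CRAWLER_SUFFIXES):
        return False                           # hostname isn't a crawler domain
    try:
        return ip in forward(host)[2]          # forward confirmation
    except OSError:
        return False
```

The forward confirmation matters because anyone can fake a PTR record pointing at googlebot.com; only the real crawler's hostname will resolve back to the requesting IP.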
The blacklist is out, the whitelist is in.