Tuesday, May 08, 2007

Block LIBWWW-PERL and web addresses to protect your site from botnets

Not only do I block all access from libwww-perl, I also log what those requests were looking for, which turns up an amazing number of botnet hits on a daily basis: bots just randomly hitting websites trying to find a way inside.
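As a rough sketch of that kind of log mining (the log path is an assumption; adjust it to your server's layout), you can pull out exactly what libwww-perl clients were requesting from a standard Apache combined-format access log:

```shell
# Sketch: list the request lines made by libwww-perl clients,
# most frequent first. Assumes the Apache combined log format,
# where the request is the first double-quoted field.
LOG=/var/log/apache2/access.log
grep -i 'libwww-perl' "$LOG" | awk -F'"' '{print $2}' | sort | uniq -c | sort -rn
```

Run against a live log, this quickly shows which vulnerable scripts the bots are probing for.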

The first trick to securing your site from the script kiddies is to block any user agent that contains "libwww-perl", which will stop the dumb ones from owning your site.

Try adding this to your .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} libwww [NC]
RewriteRule .* - [F]

Matching on just "libwww" also catches other libwww-based clients, and the [F] flag returns a 403 Forbidden. (If you chain this with other bot conditions, add [OR] back to the RewriteCond; on its own, a trailing [OR] with nothing after it is a mistake.)
The next trick is to filter out things in your QUERY_STRING such as "=http:", which is a typical pattern in the botnet scripts that attempt to upload files to vulnerable software. This won't impact most other applications, because file uploads tend to be done via a form and a POST, not a GET request.
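To make that filter concrete, here is one way it might look in .htaccess. This is a sketch rather than the author's exact rules, and the leading backslash matters: a CondPattern starting with a bare "=" would be treated by mod_rewrite as a string-equality test instead of a regex.

```apache
RewriteEngine On
# Reject any request whose query string tries to pass a full URL
# (e.g. ?page=http://evil.example/shell.txt) -- a common remote
# file inclusion probe. \= escapes mod_rewrite's "=" operator.
RewriteCond %{QUERY_STRING} \=http(s)?: [NC]
RewriteRule .* - [F]
```

Legitimate visitors rarely pass "=http:" in a GET query string, so false positives should be uncommon.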

With these two minor security changes you've blocked many botnet attackers outright and cut off their usual method of uploading files.

It's not 100%, but it may be enough to help you survive the next vulnerability in your open source application until you can actually apply the patch.

4 comments:

KingofallGeeks said...

Hi Bill, I tried to send this to you at WMW, but your box is full. Here's a copy of what I tried to sticky to you:

Just wanted to get your opinion about one particular directory out there: starting point directory at www.stpt.com

Seems pretty good and with nice PR. But I haven't heard anyone recommend it. Is this a good one to submit sites to?

Thanks for your input. Hey, I think your blog is great by the way. I only wish I understood most of it.

Anonymous said...

IB wrote: "The next trick is to filter out things in your QUERY_STRING such as "=http:", which is a typical pattern in the botnet scripts that attempt to upload files to vulnerable software."

Will you please tell us how?

Thanks :)

ShelaghG said...

I'd like to know too as I've had a big problem with bots recently spawning perl processes. For now the bots are gone but some extra bot-proofing is always good.

Thanks

Anonymous said...

Cool - it works. Requires Apache with .htaccess parsing though.