Sunday, March 12, 2006

Knuckle Scraping Neanderthal

When a scraper reads your robots.txt file, don't you think it would avoid the disallowed pages and directories?

Then would you believe this scraper reads the robots.txt file a SECOND time, after downloading just a few pages, and then immediately opens the very page it was told to leave alone? WHAMMO! It gets stopped.

How FUCKING STUPID can you be to write such brain-damaged code?
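For anyone wondering what that kind of trap looks like, here's a minimal sketch of the idea described above (not Bill's actual code): list a directory in robots.txt as disallowed, then block any client that requests it anyway. The paths, port, and in-memory block list are illustrative assumptions.

```python
# Minimal sketch of a robots.txt honeypot: disallow a trap directory,
# then block any client that fetches it anyway. Paths, port, and the
# in-memory block list are assumptions for illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer

ROBOTS_TXT = b"User-agent: *\nDisallow: /trap/\n"
BLOCKED_IPS = set()  # well-behaved crawlers never end up in here


class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ip = self.client_address[0]
        if ip in BLOCKED_IPS:
            self.send_error(403, "Blocked for ignoring robots.txt")
            return
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS_TXT)
        elif self.path.startswith("/trap/"):
            # The scraper read robots.txt and opened the disallowed
            # page anyway -- WHAMMO, it gets stopped.
            BLOCKED_IPS.add(ip)
            self.send_error(403, "Blocked for ignoring robots.txt")
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Regular page</body></html>")


if __name__ == "__main__":
    HTTPServer(("", 8000), TrapHandler).serve_forever()
```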

4 comments:

Anonymous said...

Probably because it was a human trying to figure out what you were hiding?

HopeSeekr of xMule

Anonymous said...

If that was the case, anonymous, then Bill got the title for this rant correct.

IncrediBILL said...

Nope, it wasn't a human: no graphics ever loaded, and it wasn't using Lynx.

Trust me on one thing: my profiling techniques are pretty much spot on, and I test them myself several times a week to see if I can get trapped. Humans can escape with a single click or two; it's not designed to hold humans.

Bots just keep going even when challenged to do something to prove they aren't bots, so 20+ pages later I'm sure it's not a human.
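A rough sketch of the kind of challenge counter described in that comment (the threshold, storage, and function names are assumptions, not Bill's actual setup): suspected clients get a challenge page, a human clears it with a click, and a client that keeps crawling through the challenge gets blocked.

```python
# Rough sketch of a challenge counter: suspected clients are shown a
# challenge page; a human clears it with a single click, while a bot
# that keeps fetching pages without answering is blocked after a
# threshold. All names and values here are illustrative assumptions.
CHALLENGE_THRESHOLD = 20          # "20+ pages later I'm sure it's not a human"
pages_since_challenge = {}        # ip -> pages fetched while under challenge
cleared = set()                   # ips that answered the challenge


def handle_request(ip, answered_challenge=False):
    """Return 'allow', 'challenge', or 'block' for a suspected client."""
    if answered_challenge:
        cleared.add(ip)            # one click frees a real human
        pages_since_challenge.pop(ip, None)
    if ip in cleared:
        return "allow"
    count = pages_since_challenge.get(ip, 0) + 1
    pages_since_challenge[ip] = count
    if count > CHALLENGE_THRESHOLD:
        return "block"             # kept crawling through the challenge: bot
    return "challenge"             # show the prove-you're-human page
```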

Anonymous said...

In which case, Bill, your title is way off.

A better one would be "Knuckle Scraping Cylon Toasters".