Beyond the obvious Star Wars reference in the title of this post, the Force is not strong with this crawler. The Jeteye crawler is bucking for the Empire's buggy software award today: it asked for robots.txt a whopping 11 consecutive times, and THEN asked for 11 other pages. Can anyone really write software that bad after grade school?
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
71.5.15.254 - "GET /robots.txt HTTP/1.1" "jeteyebot/0.1; http://www.jeteye.com/bot.html"
I weep for the future.
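For the record, here's roughly what a polite crawler is supposed to do instead: fetch robots.txt once per host, cache the parsed rules, and check every subsequent URL against the cached copy. A minimal sketch using Python's standard urllib.robotparser; the user agent string, helper names, and URLs below are just placeholders, not anything Jeteye actually runs.

from urllib import robotparser
from urllib.parse import urlparse

# Cache one parsed robots.txt per host, instead of re-requesting it
# before (or eleven times before) every page fetch.
_robots_cache = {}

def allowed(user_agent, url):
    parts = urlparse(url)
    host = parts.scheme + "://" + parts.netloc
    if host not in _robots_cache:
        rp = robotparser.RobotFileParser()
        rp.set_url(host + "/robots.txt")
        rp.read()                      # the one and only robots.txt request
        _robots_cache[host] = rp
    return _robots_cache[host].can_fetch(user_agent, url)

# Usage: eleven page checks, one robots.txt fetch.
# for url in pages_to_crawl:
#     if allowed("examplebot/0.1", url):
#         fetch(url)

Eleven page fetches, one robots.txt request. It really is that simple.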
1 comment:
One robots.txt request per file is a bit much.
Were any of the 11 files it took disallowed?