This swine from the land of overpriced wine asked for robots.txt, then tried to rip over 1300 pages.
220.127.116.11 "GET /robots.txt HTTP/1.0" 200 146 "-" "-"

Too bad, Pepe Le Pew: your feeble scraping attempts SUCK, and you got 1300+ pages of error messages, so Phuck Off.
18.104.22.168 [ALille-252-1-48-2.w83-198.abo.wanadoo.fr.] requested 1321 pages as "Mozilla/4.0 (compatible; MSIE 5.0; Windows NT 4.0)"
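For anyone who wants to catch this kind of ripper in their own logs, here's a minimal sketch. The regex is an assumption modeled on the stripped-down log lines quoted above (IP, request, status, bytes, referer, user-agent), not on any particular Apache LogFormat, and the sample line and user-agent string are the ones from this post:

```python
import re

# Pattern assumed from the log excerpts above:
# IP "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) "(?P<request>[^"]*)" (?P<status>\d{3}) '
    r'(?P<bytes>\d+|-) "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"$'
)

# The probe line the scraper sent first
line = '220.127.116.11 "GET /robots.txt HTTP/1.0" 200 146 "-" "-"'

m = LOG_RE.match(line)
if m:
    ip, request, agent = m.group("ip"), m.group("request"), m.group("agent")
    # A blank or ancient user-agent fetching robots.txt right before a
    # burst of requests is a classic ripper signature
    suspicious = agent in ("-", "Mozilla/4.0 (compatible; MSIE 5.0; Windows NT 4.0)")
    print(ip, request, "SUSPICIOUS" if suspicious else "ok")
```

From there you could count requests per IP and feed anything over a threshold into your ban list; how you actually serve the 1300+ error pages is left as an exercise.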
[sing along, with apologies to Sheryl Crow...]
All I wanadoo is scrape some pages,
We'll download it, and not take ages.
All I wanadoo is grab your site,
And then cloak it all to Google tonight!