Gas prices don't bother bots - they just want content. Or just email addresses, if they're that kind of bot. Anyway, that was my attempt at relating bots to current events.
User Agent Test Track.
Of course, the script didn't get a chance to execute, because we plan for this sort of thing.
While monitoring this week, a bot from page-store.com appeared on my radar because of its crawl volume and rate.
Nothing worth banning, but it got my attention.
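The kind of per-agent crawl-rate monitoring described above can be sketched roughly as follows. This is a minimal illustration, not our actual tooling; the class name, window size, and request limit are all made up for the example.

```python
import time
from collections import defaultdict, deque


class CrawlRateMonitor:
    """Track request timestamps per user agent and flag fast crawlers.

    The 120-requests-per-minute default is an arbitrary illustration;
    real thresholds depend on the site being crawled.
    """

    def __init__(self, window_seconds: float = 60.0, max_requests: int = 120):
        self.window = window_seconds
        self.max_requests = max_requests
        self.hits = defaultdict(deque)  # user agent -> recent timestamps

    def record(self, user_agent: str, now: float = None) -> bool:
        """Record one request; return True if this agent exceeds the limit."""
        now = time.time() if now is None else now
        q = self.hits[user_agent]
        q.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_requests
```

A monitor like this flags an agent only while its request count inside the sliding window exceeds the limit, so a bot that slows down stops being flagged on its own.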
Luckily, they were kind enough to provide an email address in the user agent.
I emailed them, and the problem was very quickly resolved.
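Pulling a contact address out of a user agent string is straightforward when the operator includes one. A minimal sketch, assuming a simple email-shaped token; the user agent string shown is hypothetical, not page-store.com's actual one:

```python
import re

def contact_email(user_agent: str):
    """Return the first email-like token in a user agent string, or None."""
    match = re.search(r"[\w.+-]+@[\w-]+\.[\w.-]+", user_agent)
    return match.group(0) if match else None

# Hypothetical user agent for illustration only.
ua = "PageStoreBot/1.0 (+http://page-store.com; crawler@page-store.com)"
```

With the string above, `contact_email(ua)` returns the address after the semicolon, ignoring the URL since it contains no `@`.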
I also asked them to tell us a little about what they do.
According to their reply, Page-store contracts with search-engine startups to supply them with web-crawl data.
Their value-added services include site aggregation, character-encoding handling, language filtering, porn filtering, spam filtering, on-demand depth-k crawls, and site-level and URL-level web-link data.
Unlike many of the bots and scrapers we run across, this one actually has a legitimate purpose, and it responded to our request to conserve bandwidth.
BebopBot/2.5.1, which apparently has a passion for jazz, stopped in this week.
Here's a list of some other interesting bots that made their debut in our logs this week:
After this week, our database is up to 182,223 user agents and 1,957