
Random Amazon bot visits

Visits by bots are normal for any site, and a normal, compliant bot will identify itself as such in its user agent and provide a link to an information page. Bots coming out of Amazon's IP ranges play it differently.

Amazon bots

Visits from the Amazon IP ranges by bots that fake their user agent to appear as a normal browser are easy to detect in the server log files. These visitors basically request the home page URL and nothing more, which makes them stand out compared to a regular visitor.
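
A minimal sketch of that detection, assuming an access log in the common/combined format (the path access.log is a placeholder): any IP whose only request is for the home page, with none of the follow-up requests for assets that a real browser makes, gets flagged.

```python
import re
from collections import defaultdict

# Rough pattern for the common/combined log format; adjust to the
# actual log format of the web server in question.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+'
)

def homepage_only_ips(logfile="access.log"):
    """Return IPs whose requests were exclusively for the home page URL."""
    paths_by_ip = defaultdict(set)
    with open(logfile, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m:
                paths_by_ip[m.group("ip")].add(m.group("path"))
    # A regular visitor also loads CSS, JS, images, the favicon and so on;
    # an IP that only ever asked for "/" stands out as a likely bot.
    return [ip for ip, paths in paths_by_ip.items() if paths == {"/"}]

if __name__ == "__main__":
    for ip in homepage_only_ips():
        print(ip)
```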

It becomes very odd when a subdomain that only exists as a DNS record starts getting such bot requests; it is apparent that trawling through DNS records in order to find web servers is how these bots locate their targets. Their purpose remains unclear and impossible to find out, which leaves only two choices for handling them, since they ignore the robots.txt file.
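
Whether such a visitor really sits in Amazon's address space can be checked against the ranges AWS itself publishes at https://ip-ranges.amazonaws.com/ip-ranges.json. A small standard-library sketch (the example address is a documentation placeholder, substitute an address from the log):

```python
import json
import ipaddress
import urllib.request

# AWS publishes its address ranges here; the "prefixes" entries carry
# an "ip_prefix" CIDR and a "service" label such as "AMAZON" or "EC2".
AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

def load_amazon_networks():
    with urllib.request.urlopen(AWS_RANGES_URL) as resp:
        data = json.load(resp)
    return [ipaddress.ip_network(p["ip_prefix"]) for p in data["prefixes"]]

def is_amazon_ip(ip, networks):
    """True if the visitor IP falls inside any published Amazon range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

if __name__ == "__main__":
    nets = load_amazon_networks()
    # 203.0.113.7 is a documentation example; use an IP from the log.
    print(is_amazon_ip("203.0.113.7", nets))
```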

Blocking is the only solution

One can either ignore or block these requests, and blocking means answering 403 to all such requests across all sites on the web server. If, and that is a massive if, the creator of the bots had any sense, they would note the 403 errors and not revisit. Better yet, they would use proper bot user agents and follow the robots.txt settings.

That isn't how the logic for these bots works, and the end result is blocking all Amazon IP ranges as they show up on the web server.
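
One way to keep such a blocklist current, sketched here under the assumption of an nginx front end: map each offending address from the log to the published Amazon range that contains it and emit a deny rule for the whole range, which nginx's access module answers with 403.

```python
import json
import ipaddress
import urllib.request

# AWS publishes its address ranges here; "prefixes" holds the IPv4 CIDRs.
AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

def amazon_networks():
    with urllib.request.urlopen(AWS_RANGES_URL) as resp:
        data = json.load(resp)
    return [ipaddress.ip_network(p["ip_prefix"]) for p in data["prefixes"]]

def deny_rules(offending_ips):
    """For each offending IP, find the published Amazon range containing it
    and emit one nginx 'deny' directive per range; requests from denied
    addresses are then answered with 403."""
    networks = amazon_networks()
    cidrs = set()
    for ip in offending_ips:
        addr = ipaddress.ip_address(ip)
        for net in networks:
            if addr.version == net.version and addr in net:
                cidrs.add(str(net))
                break
    return [f"deny {cidr};" for cidr in sorted(cidrs)]

if __name__ == "__main__":
    # Feed in the addresses flagged earlier (e.g. the home-page-only IPs);
    # addresses outside Amazon's published ranges are simply skipped.
    offenders = ["203.0.113.7"]  # placeholder documentation address
    for rule in deny_rules(offenders):
        print(rule)
```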

This way the 403 error code makes it easy to exclude all such visitors from any analysis based on the server log file.
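
A short sketch of that exclusion step, again assuming a common/combined-format log and placeholder file names: drop every 403 line before handing the log to whatever analysis follows.

```python
import re

# The status code sits in the field right after the quoted request in the
# common/combined log format; adjust the pattern to the actual log format.
STATUS = re.compile(r'" (\d{3}) ')

def without_blocked(logfile="access.log", outfile="access-clean.log"):
    """Copy the access log, dropping every 403 (blocked bot) entry,
    so that log-based analytics only see real visitors."""
    kept = dropped = 0
    with open(logfile, encoding="utf-8", errors="replace") as src, \
         open(outfile, "w", encoding="utf-8") as dst:
        for line in src:
            m = STATUS.search(line)
            if m and m.group(1) == "403":
                dropped += 1
                continue
            dst.write(line)
            kept += 1
    return kept, dropped

if __name__ == "__main__":
    print("kept %d lines, dropped %d blocked requests" % without_blocked())
```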

---