Forum Moderators: coopster
I have a directory with what will be about 100,000 images (I'm building a GIS system).
At the moment I'm holding said images in a directory and serving them when requested.
This works great, but the problem is that someone could run a bot and download the whole lot.
I would like to pull just the 9 images the user requests out of a secure area (or whatever), thereby limiting the exposure.
I have used the image protection on my webserver before, but it seems to block anyone using Norton from seeing the images (useless piece of...).
Are there any solutions in PHP that won't melt the webserver under load, or possibly a local image protection rule in .htaccess (sod Norton users)?
Any help would be most appreciated.
Cheers,
Hughie
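One common way to do the "pull from a secure area" part in PHP is to keep the images outside the document root and stream them through a small pass-through script, so there is no direct URL a bot can crawl. A sketch, not a drop-in solution — the `/var/gis/tiles/` path and the filename pattern are assumptions you'd adjust to your setup:

```php
<?php
// Sketch of a pass-through image server. Tiles live OUTSIDE the web
// root (here /var/gis/tiles/ -- a hypothetical path), so they can only
// be fetched through this script, never by direct URL.

// Whitelist the requested filename so nobody can walk the filesystem
// with ../ tricks. Returns the safe name, or false if it looks dodgy.
function safe_tile_name($raw) {
    $name = basename($raw);
    if (preg_match('/^[A-Za-z0-9_\-]+\.(png|jpe?g|gif)$/', $name)) {
        return $name;
    }
    return false;
}

if (isset($_GET['img'])) {
    $img = safe_tile_name($_GET['img']);
    if ($img === false) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    $path = '/var/gis/tiles/' . $img;
    if (!is_file($path)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    // readfile() streams straight to the client without slurping the
    // whole file into memory, so the per-request cost stays small.
    header('Content-Type: image/png');
    header('Content-Length: ' . filesize($path));
    readfile($path);
}
```

Your map page would then reference the 9 tiles as e.g. `serve_tile.php?img=tile_0042.png` instead of a raw image path. `readfile()` is cheap enough that the webserver shouldn't melt under normal load; it's the throttling of abusers that needs extra logic.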
But you have to be careful about how you predict user behavior. What if a user opens four or five pages in different tabs to load while they look at the first one? They may request 40-50 of your protected images in the course of a few seconds, and if your image-serving script decides they were a bot and bans them, you may lose the visitor.
The bad bot script here [webmasterworld.com] uses a header like:
header( 'HTTP/1.0 503 Service Temporarily Unavailable' );
Perhaps that in combination with an interface tip like "Don't see all the images? Wait a few seconds and hit refresh as we may be under heavy traffic." would be better?
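That soft-throttle idea could look something like this: count image requests per session over a short sliding window, and answer with 503 plus a `Retry-After` header once a (hypothetical) threshold is passed, rather than banning anyone. The window and limit below are guesses sized so a tab-happy human stays under them:

```php
<?php
// Sketch of a soft per-session throttle: beyond MAX_REQUESTS hits in
// WINDOW_SECONDS, answer 503 instead of banning, so real users who
// opened several tabs can just wait and refresh.

define('WINDOW_SECONDS', 10);
define('MAX_REQUESTS', 60);   // generous: ~5 pages x 9 tiles, plus slack

// Returns true if this request is allowed, false if it should be
// throttled. $timestamps is the session's list of recent request times.
function allow_request(&$timestamps, $now, $window, $max) {
    // Drop timestamps that have fallen out of the sliding window.
    $fresh = array();
    foreach ($timestamps as $t) {
        if ($t > $now - $window) {
            $fresh[] = $t;
        }
    }
    $timestamps = $fresh;
    if (count($timestamps) >= $max) {
        return false;
    }
    $timestamps[] = $now;
    return true;
}

if (isset($_GET['img'])) {
    session_start();
    if (!isset($_SESSION['hits'])) {
        $_SESSION['hits'] = array();
    }
    if (!allow_request($_SESSION['hits'], time(), WINDOW_SECONDS, MAX_REQUESTS)) {
        header('HTTP/1.0 503 Service Temporarily Unavailable');
        header('Retry-After: ' . WINDOW_SECONDS);
        exit;
    }
    // ...fall through to the normal image-serving code here...
}
```

A bot hammering the script hits the 503 wall almost immediately, while a human who trips it by accident gets the "wait a few seconds and hit refresh" experience rather than a ban.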