Serving images with PHP

Want to hold images in a secure area and pass only the requested ones to the user


hughie

9:19 am on Apr 25, 2006 (gmt 0)

10+ Year Member



Hi there,

I have a directory with what will be about 100,000 images (I'm building a GIS system).

At the moment I'm holding the images in a directory and requesting them when required.

This works great, but the problem is that someone could run a bot and download the whole lot.

I would like to pull only the nine images the user requests out of a secure area (or whatever), thereby limiting the exposure.

I have used the image protection on my web server before, but it seems to block anyone using Norton from seeing the images (useless piece of...).

Are there any PHP solutions that won't melt the web server under load, or possibly a local image protection rule in .htaccess (sod the Norton users)?

Any help would be most appreciated.
Cheers,
Hughie

hughie

5:04 pm on Apr 25, 2006 (gmt 0)

10+ Year Member



Anyone...?

whoisgregg

6:34 pm on Apr 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could set up an image-serving script that decides whether or not to serve the requested image based on visitor IP, cookie, session, requests in the last hour, etc.

But you have to be careful about how you predict user behavior. What if a user opens four or five pages in different tabs to load while they look at the first page? They may request 40-50 of your protected images in the course of a few seconds, and if your image-serving script decides they're a bot and bans them, you may lose the visitor.
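For what it's worth, a minimal sketch of such a script might look like this (the directory path, the "img" parameter name, and the 60-per-minute cap are all placeholder assumptions, not anything from this thread or tested at 100,000 images):

<?php
// serve_image.php -- rough sketch of a rate-limited image server.
// The protected directory, the "img" query parameter, and the cap of
// 60 requests per minute are placeholder choices; adjust to taste.
session_start();

$imageDir     = '/home/site/protected_images/'; // kept outside the web root
$maxPerMinute = 60; // generous enough for a user opening several tabs

// Per-session rate limiting: reset the counter every 60 seconds.
$now = time();
if (!isset($_SESSION['img_window']) || $now - $_SESSION['img_window'] >= 60) {
    $_SESSION['img_window'] = $now;
    $_SESSION['img_count']  = 0;
}
$_SESSION['img_count']++;

if ($_SESSION['img_count'] > $maxPerMinute) {
    header('HTTP/1.0 503 Service Temporarily Unavailable');
    exit;
}

// Accept only a plain filename with an image extension -- no path tricks.
$file = isset($_GET['img']) ? basename($_GET['img']) : '';
if ($file === '' || !preg_match('/\.(png|jpe?g|gif)$/i', $file, $m)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

$path = $imageDir . $file;
if (!is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

$types = array('png'  => 'image/png',  'gif'  => 'image/gif',
               'jpg'  => 'image/jpeg', 'jpeg' => 'image/jpeg');
header('Content-Type: ' . $types[strtolower($m[1])]);
header('Content-Length: ' . filesize($path));
readfile($path);

The page markup would then point at serve_image.php?img=tile_123.png (hypothetical names) rather than at the files themselves, so the image directory never has to be web-accessible at all.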

hughie

9:41 pm on Apr 25, 2006 (gmt 0)

10+ Year Member



That's a good idea. I agree it has limitations, but it could be managed by allowing x views per minute for anonymous users (roughly sketched below) and then something like

"register to increase your page views"

and "give me money to get even more" ;-)

I think I'll look into that.

Many thanks,
hughie

whoisgregg

10:34 pm on Apr 26, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Don't forget that robots may hammer your images, and serving them alternate content once they've passed a certain threshold has the potential to increase your bandwidth costs (and to confuse the heck out of them when they get HTML instead of image data).

The bad bot script here [webmasterworld.com] uses a header like:

header( 'HTTP/1.0 503 Service Temporarily Unavailable' ); 

Perhaps that, in combination with an interface tip like "Don't see all the images? Wait a few seconds and hit refresh, as we may be under heavy traffic," would be better?
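For instance, a minimal sketch of that approach (the $overLimit flag here stands in for whatever threshold check the serving script already performs):

<?php
// Once a client passes the abuse threshold, answer with a bare 503 --
// a few bytes of headers and no body, instead of alternate content.
if ($overLimit) { // assumed to come from your existing rate tracking
    header('HTTP/1.0 503 Service Temporarily Unavailable');
    header('Retry-After: 60'); // suggest the client retry in a minute
    exit;
}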