Forum Moderators: coopster

Watermarking Images on the fly

What are the performance issues?


neophyte

11:39 am on Sep 20, 2010 (gmt 0)

10+ Year Member



Hello All -

I've got the need to display images on a client's site with a watermark, which, of course, PHP can do.

Now, the question is: if I do this on the fly with each page request, what are the performance costs of (a) reading the image, (b) watermarking the image, and (c) serving the newly watermarked image to the browser? It's true that there will only be one watermarked image per page request, but the image on each page is a rather large (700 x 400 px) full color image.

Is the php and the GD library fast enough at this to not slow the page/image rendering process too dramatically?

If the good folks here think that this process would ***noticeably*** affect image rendering, I guess an alternative would be to write a script that would batch process a directory of images off line and save those images complete with the watermark for on-line use.
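For reference, the on-the-fly version I have in mind would be something like the sketch below (file names are placeholders, and it assumes GD is available and the watermark is a PNG with an alpha channel):

```php
<?php
// Sketch only -- paths are placeholders, not a final implementation.
function watermark_image($photoPath, $wmPath, $outPath = null)
{
    $photo     = imagecreatefromjpeg($photoPath);
    $watermark = imagecreatefrompng($wmPath);

    // Bottom-right corner with a 10px margin; imagecopy() honours
    // the PNG's own alpha channel.
    $dstX = imagesx($photo) - imagesx($watermark) - 10;
    $dstY = imagesy($photo) - imagesy($watermark) - 10;
    imagecopy($photo, $watermark, $dstX, $dstY,
              0, 0, imagesx($watermark), imagesy($watermark));

    if ($outPath === null) {
        header('Content-Type: image/jpeg'); // serve straight to the browser
        imagejpeg($photo, null, 85);
    } else {
        imagejpeg($photo, $outPath, 85);    // or save to disk instead
    }
    imagedestroy($photo);
    imagedestroy($watermark);
}

// On-the-fly use:
// watermark_image('images/photo.jpg', 'images/watermark.png');
```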

All opinions gratefully appreciated.

Neophyte

Matthew1980

12:34 pm on Sep 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi there neophyte,

I think this will be a matter of preference, but personally I would do the watermarking once all your images are in your chosen directory, and then build a script that periodically checks it for new additions and watermarks any it finds. I wouldn't do this OTF, as I think it would create a lot of load on the parser and CPU, especially if you have the potential for a lot of traffic and the images are the size you've hinted at.

At least doing it this way there wouldn't be any noticeable 'delay' when loading a page, and setting up a cron job (or even making it an admin-area task/option that lists all new additions, then click Process...) would be the better option.
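That periodic check could be sketched like this (directory names are hypothetical; because images that already have a watermarked copy are skipped, it's safe to re-run from cron):

```php
<?php
// Batch sketch -- watermark every JPEG in $srcDir that doesn't yet
// have a counterpart in $outDir.
function batch_watermark($srcDir, $outDir, $wmPath)
{
    $wm   = imagecreatefrompng($wmPath);
    $done = 0;
    foreach (glob($srcDir . '/*.jpg') as $srcPath) {
        $outPath = $outDir . '/' . basename($srcPath);
        if (file_exists($outPath)) {
            continue; // already processed on an earlier run
        }
        $photo = imagecreatefromjpeg($srcPath);
        imagecopy($photo, $wm,
                  imagesx($photo) - imagesx($wm) - 10,
                  imagesy($photo) - imagesy($wm) - 10,
                  0, 0, imagesx($wm), imagesy($wm));
        imagejpeg($photo, $outPath, 85);
        imagedestroy($photo);
        $done++;
    }
    imagedestroy($wm);
    return $done; // how many new images were watermarked this run
}

// Cron use:
// batch_watermark('images/originals', 'images/watermarked',
//                 'images/watermark.png');
```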

Hope that makes sense.

Cheers,
MRb

neophyte

12:36 am on Sep 21, 2010 (gmt 0)

10+ Year Member



Matthew -

Upon considering the two options, I have to agree with you that doing it off-line (pre-deployment) is probably the most sensible solution.

Thanks very much for your input.

Neophyte

ergophobe

5:02 pm on Sep 22, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I profiled a slow site at one point and was surprised to find that GetImageSize(), which I was using simply to build proper <img> tags with height and width attributes, was eating up huge resources (around 10-12 calls per page, depending on what was on the page at the time).

I solved this by caching the page on first request and serving up the cached version whenever possible.

Anyway, two lessons:
- image ops can be very costly
- profiling will tell you just how costly.
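One way to cut the GetImageSize() cost on its own is to memoize the results so each file is only read once, and persist the array between requests. A sketch (the function name and cache file are hypothetical):

```php
<?php
// Cache getimagesize() results in a small PHP file that returns an array.
function cached_image_size($path, $cacheFile = 'cache/imagesizes.php')
{
    static $sizes = array();
    if (!$sizes && is_file($cacheFile)) {
        $loaded = include $cacheFile;   // previously saved sizes
        if (is_array($loaded)) {
            $sizes = $loaded;
        }
    }
    if (!isset($sizes[$path])) {
        $sizes[$path] = getimagesize($path); // the expensive call
        file_put_contents($cacheFile,
            '<?php return ' . var_export($sizes, true) . ';');
    }
    return $sizes[$path]; // array(width, height, type, attribute string)
}

// Usage: list($w, $h) = cached_image_size('images/photo.jpg');
```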

neophyte

1:19 am on Sep 23, 2010 (gmt 0)

10+ Year Member



Hi Ergophobe -

Thanks for the information; interestingly I do exactly the same thing: call GetImageSize() in order to build proper <img> tags. Can you give me any information (or a link) explaining how to properly cache pages as well as profile a site?

I imagine that my pages are already cached - but don't know for certain - and I've never tried any profiling routines.

Thanks very much!

Neophyte

ergophobe

1:05 am on Sep 24, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It depends a little on the page and the nature of the site. In my case, the front page doesn't change that much (only when I add new content, but it's not like there's a Twitter feed or something like that). So all I do is

- check to see if there's a cached copy in /cache/request/path/file where request/path/file would be the original URL.

- if not, start page generation.
+ turn on output buffering
+ generate page as normal
+ grab contents of buffer in a variable
+ output page
+ take buffer contents and write them to a file at /cache/request/path/file

- manually flush the cache - in other words, I have an admin-only link that forces a hard page regeneration whether the cache file exists or not. If you have more timely needs, you could do it as a cron job every 10 minutes, or whenever a new page is added, or whatever you need.

[php.net...]
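The steps above come out to something like this sketch (the cache root and page generator are hypothetical; I've used a hashed file name rather than mirroring the URL path, and on a real site you'd sanitize the request path first):

```php
<?php
// Serve a cached copy if one exists; otherwise generate the page while
// buffering it, send it, and write the buffer to the cache file.
function serve_with_cache($cacheRoot, $requestPath, $generator)
{
    $cacheFile = $cacheRoot . '/' . md5($requestPath) . '.html';

    // Check for a cached copy first.
    if (is_file($cacheFile)) {
        readfile($cacheFile);
        return true;              // cache hit
    }

    ob_start();                   // turn on output buffering
    $generator();                 // generate page as normal
    $html = ob_get_contents();    // grab buffer contents in a variable
    ob_end_flush();               // output page

    @mkdir($cacheRoot, 0755, true);
    file_put_contents($cacheFile, $html); // write buffer to cache file
    return false;                 // cache miss, now primed
}

// Usage:
// serve_with_cache('cache', $_SERVER['REQUEST_URI'],
//                  function () { echo '<html>...</html>'; });
```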

ergophobe

1:42 am on Sep 24, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oh, about profiling - just install Xdebug and open its cachegrind output in KCachegrind or WinCacheGrind, and it will give you mega output on how long every function in your script is taking, user-defined and native PHP functions included.
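For reference, turning the profiler on is just a couple of php.ini settings (the paths here are placeholders for your own setup):

```ini
; Xdebug 2 profiler settings -- adjust paths to your build.
zend_extension=/path/to/xdebug.so
xdebug.profiler_enable=1             ; profile every request, or...
;xdebug.profiler_enable_trigger=1    ; ...only when XDEBUG_PROFILE is sent
xdebug.profiler_output_dir=/tmp/profiles
```

The cachegrind.out.* files it writes to that directory are what KCachegrind/WinCacheGrind open.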

ergophobe

1:43 am on Sep 24, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



PS - don't set up profiling on a live site. Do it on a test site under conditions as similar as possible to your live site. Profiling writes data for every single function call; this puts a HUGE load on the site and will slow it to a crawl.

neophyte

12:33 am on Sep 25, 2010 (gmt 0)

10+ Year Member



ergophobe -

Wow, what a great rundown (especially about the profiling - I'd heard about that before but never really understood what it was or meant).

Thanks very much for the information!

Neophyte

ergophobe

5:43 am on Sep 26, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Handful of threads at least partially on various benchmarking/profiling issues:

[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]