gzip, find and ownership
7:31 pm on Aug 13, 2008 (gmt 0)
I'm looking for a way to create preprocessed .gz files of static pages to serve up to those who can accept them.
I know I can use:
gzip -c --best index.html > index.html.gz
to create the .gz file _and_ keep the original.
What's the proper command-line way to run that on each index.html file in all the subdirectories? There's a way to do it via find, right? Can it be done as a one-liner, or do I need a script?
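It can be done from the command line with find's -exec. A minimal sketch (run from the document root; the redirection has to happen inside a sub-shell because find itself can't redirect per-file):

```shell
# For every index.html under the current directory, write a
# compressed copy alongside it as index.html.gz, keeping the
# original. The _ is a placeholder for $0 in the sub-shell.
find . -type f -name 'index.html' \
    -exec sh -c 'gzip -c --best "$1" > "$1.gz"' _ {} \;
```

The sub-shell trick keeps the original file intact, exactly like the single-file `gzip -c --best index.html > index.html.gz` invocation, just applied to every match.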
Also, if I run the cron job as a user other than root (e.g. my apache user), will the resulting .gz files be owned by apache, or do I have to chown them?
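On the ownership question: on Unix, a newly created file belongs to the effective user that created it, so a gzip run from apache's crontab should produce apache-owned files with no chown needed (assuming the directories are writable by that user). A quick sanity check:

```shell
# A new file is owned by whoever creates it; this prints the
# current user's name twice. (stat -c is the GNU coreutils form;
# BSD stat uses -f '%Su' instead.)
touch demo.gz
stat -c '%U' demo.gz
id -un
```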
Thanks for any advice!
All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
© Webmaster World 1996-2013 all rights reserved