Home / Forums Index / Code, Content, and Presentation / PHP Server Side Scripting
Forum Library, Charter, Moderators: coopster & jatar k

PHP Server Side Scripting Forum

    
WGET to download file
alphacooler

5+ Year Member



 
Msg#: 3206080 posted 7:21 pm on Jan 2, 2007 (gmt 0)

Here is a script I found to download files using the command line and wget. Unfortunately, the original programmer's comments aren't quite enough for me to understand it.

Here is the script:

http://www.zend.com/tips/tips.php?id=91&single=1

I know the path to wget on my server (/usr/bin/wget), but I'm not sure what to put in the following:

  • $destination-path
  • $url (the URL of the file I want to download, presumably)
  • /path-to-outfile/output
  • $temp = `/path-to-cat/cat /path-to-outfile/output`;

I guess I am not sure of the "format" I need to enter for these vars.

Perhaps a dummy example would be of help.

Thanks so much.
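[A dummy example of how those variables might be filled in. Every value below is a made-up placeholder, not taken from the Zend tip; the exec() call is left commented out for that reason.]

```php
<?php
// Hypothetical values -- substitute your own URL and paths.
$url              = "http://example.com/jpg/flyweb.jpg"; // file to download ($url)
$destination_path = "/tmp/downloads";                    // where wget saves it ($destination-path)
$outfile          = "/tmp/wget-output.log";              // wget's console log (/path-to-outfile/output)

// -P sets the download directory; stdout and stderr both go into $outfile
// so the script can read wget's messages afterwards, like the tip's cat line.
$cmd = "/usr/bin/wget -P " . escapeshellarg($destination_path) . " "
     . escapeshellarg($url) . " > " . escapeshellarg($outfile) . " 2>&1";

// exec($cmd);
// $temp = file_get_contents($outfile); // the `cat` line, in pure PHP
echo $cmd, "\n";
```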


    eelixduppy

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 5+ Year Member



     
    Msg#: 3206080 posted 7:32 pm on Jan 2, 2007 (gmt 0)

    I'm not familiar with wget, but perhaps you should visit the documentation [gnu.org] for more information.

    You could also try a Google search [google.com].

    alphacooler

    5+ Year Member



     
    Msg#: 3206080 posted 7:51 pm on Jan 2, 2007 (gmt 0)

    I'm not a complete n00b. I gave both of those a try before posting.

    mcavic

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 3206080 posted 7:59 pm on Jan 2, 2007 (gmt 0)

    Forget that script that you linked to. At first glance, it stinks badly.

    Just read the wget documentation, try calling it yourself from the command line to make sure you understand it, and then script it yourself using PHP's popen() function.
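[A minimal popen() sketch of what that could look like, assuming wget lives at /usr/bin/wget as mentioned above. popen() returns the command's output as a stream, so it can also be processed line by line as it arrives.]

```php
<?php
// Run a shell command and return its combined output and exit status.
function run_command($cmd) {
    $output = "";
    $handle = popen($cmd . " 2>&1", "r"); // 2>&1: wget logs to stderr
    if ($handle === false) {
        return array(false, -1);
    }
    while (($line = fgets($handle)) !== false) {
        $output .= $line;
    }
    $status = pclose($handle); // the command's exit status
    return array($output, $status);
}

// e.g.: list($log, $status) = run_command("/usr/bin/wget -nv http://example.com/jpg/flyweb.jpg");
```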

    alphacooler

    5+ Year Member



     
    Msg#: 3206080 posted 8:13 pm on Jan 2, 2007 (gmt 0)

    I tried to just download a simple image from a documentation example.

    exec("/usr/bin/wget http://example.com/jpg/flyweb.jpg");

    That should work, but doesn't.

    [edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
    [edit reason] generalized url [/edit]

    eelixduppy

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 5+ Year Member



     
    Msg#: 3206080 posted 8:22 pm on Jan 2, 2007 (gmt 0)

    >>That should work, but doesn't.

    If the connection is slow and the file is large, the download may fail partway through.

    Try something like this:

    exec("/usr/bin/wget --tries=45 http://example.com/jpg/flyweb.jpg");

    [edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
    [edit reason] generalized url [/edit]

    mcavic

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 3206080 posted 8:28 pm on Jan 2, 2007 (gmt 0)

    exec("/usr/bin/wget http://example.com/jpg/flyweb.jpg");

    That should work, and save flyweb.jpg to the current directory. But if it fails, it won't tell you why. Also, the current directory might not be where you want it saved. Also, if you're running this under a Web server, it might not have permission to write to the current directory.
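    [Putting those three caveats together, a sketch of the same call with basic diagnostics: capture wget's messages and exit code, pick the save directory explicitly with -P, and check that the directory is writable first. The wget path comes from earlier in the thread; the URL and directory are example values.]

```php
<?php
// Fetch $url into $dir using the wget binary at $wget.
// Returns array(success, log text).
function wget_fetch($wget, $url, $dir) {
    if (!is_writable($dir)) {
        return array(false, "cannot write to $dir -- check the web server user's permissions");
    }
    $output = array();
    $status = 0;
    // 2>&1 matters: wget writes its progress and errors to stderr,
    // which exec() would otherwise silently discard.
    exec($wget . " -P " . escapeshellarg($dir) . " "
               . escapeshellarg($url) . " 2>&1", $output, $status);
    return array($status === 0, implode("\n", $output));
}

// e.g.: list($ok, $log) = wget_fetch("/usr/bin/wget", "http://example.com/jpg/flyweb.jpg", "/tmp");
// if (!$ok) { echo $log; }
```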

    [edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
    [edit reason] generalized url [/edit]

    All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
    WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
    © Webmaster World 1996-2014 all rights reserved