
Forum Moderators: coopster & jatar k


WGET to download file

     

alphacooler

7:21 pm on Jan 2, 2007 (gmt 0)

10+ Year Member



Here is a script I found to download files using the command line and wget, but unfortunately the original programmer's comments aren't quite enough for me to understand it.

Here is the script:

[URL="http://www.zend.com/tips/tips.php?id=91&single=1"]http://www.zend.com/tips/tips.php?id=91&single=1[/URL]

I know the path to wget on my server (/usr/bin/wget), but I'm not sure what to put in the following:

[LIST]
  • $destination-path
  • $url (url of file I want to d/l presumably)
  • /path-to-outfile/output
  • $temp = `/path-to-cat/cat /path-to-outfile/output`;
[/LIST]

I guess I am not sure of the "format" I need to enter for these vars.

Perhaps a dummy example would be of help.

Thanks so much.
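A minimal sketch of how those placeholders typically fit together; the paths and URL below are assumed stand-ins for illustration, not values from the original script:

```php
<?php
// Hypothetical stand-ins for the script's variables; adjust to your server.
$wget        = '/usr/bin/wget';
$url         = 'http://example.com/jpg/flyweb.jpg';   // file to download
$destination = '/tmp/downloads/flyweb.jpg';           // i.e. $destination-path

// Build the command; escapeshellarg() protects against shell metacharacters.
$cmd = $wget . ' -O ' . escapeshellarg($destination) . ' ' . escapeshellarg($url);

exec($cmd . ' 2>&1', $output, $status);

if ($status === 0) {
    // Read the downloaded file back, as the original script did with `cat`;
    // file_get_contents() does the same without spawning another process.
    $temp = file_get_contents($destination);
} else {
    echo "wget failed with exit code $status\n";
}
```

wget's -O flag writes the download to an explicit path, which stands in for both /path-to-outfile/output and $destination-path in the script's terms.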

eelixduppy

7:32 pm on Jan 2, 2007 (gmt 0)

WebmasterWorld Senior Member eelixduppy is a WebmasterWorld Top Contributor of All Time 5+ Year Member



I'm not familiar with wget, but perhaps you should visit the documentation [gnu.org] for more information.

You could also try a google search [google.com].

alphacooler

7:51 pm on Jan 2, 2007 (gmt 0)

10+ Year Member



I'm not a complete n00b. I gave both of those a try before posting.

mcavic

7:59 pm on Jan 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Forget that script that you linked to. At first glance, it stinks badly.

Just read the wget documentation, try calling it yourself from the command line to make sure you understand it, and then you can script it yourself using PHP's popen() function.
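A minimal popen() sketch along the lines suggested above; the URL is an assumed placeholder. Unlike exec(), which only hands back output after the process exits, popen() lets you read wget's output line by line while the download runs:

```php
<?php
// Assumed wget path and URL; adjust for your server.
$url = 'http://example.com/jpg/flyweb.jpg';
$cmd = '/usr/bin/wget ' . escapeshellarg($url) . ' 2>&1';

$handle = popen($cmd, 'r');
if ($handle === false) {
    die("could not start wget\n");
}
while (($line = fgets($handle)) !== false) {
    echo $line;               // wget's progress/diagnostic messages
}
$status = pclose($handle);    // termination status; 0 means success
```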

alphacooler

8:13 pm on Jan 2, 2007 (gmt 0)

10+ Year Member



I tried to just download a simple image from a documentation example.

exec("/usr/bin/wget http://example.com/jpg/flyweb.jpg");

That should work, but doesn't.

[edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
[edit reason] generalized url [/edit]

eelixduppy

8:22 pm on Jan 2, 2007 (gmt 0)

WebmasterWorld Senior Member eelixduppy is a WebmasterWorld Top Contributor of All Time 5+ Year Member



>>That should work, but doesn't.

If the connection is slow and the file is large, the download will most likely fail.

Try something like this:

exec("/usr/bin/wget --tries=45 http://example.com/jpg/flyweb.jpg");

[edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
[edit reason] generalized url [/edit]
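Two documented companions to --tries can help with flaky connections (confirm them with `wget --help` on your build): --timeout bounds each network operation, and -c resumes a partial file instead of restarting. The URL below is an assumed placeholder:

```php
<?php
// Assumed URL; --tries caps retry attempts, --timeout bounds each
// network operation in seconds, -c continues a partial download.
$url = 'http://example.com/jpg/flyweb.jpg';
$cmd = '/usr/bin/wget --tries=45 --timeout=30 -c ' . escapeshellarg($url);

exec($cmd . ' 2>&1', $output, $status);
echo $status === 0 ? "downloaded\n" : "wget exited with code $status\n";
```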

mcavic

8:28 pm on Jan 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



exec("/usr/bin/wget http://example.com/jpg/flyweb.jpg");

That should work, and save flyweb.jpg to the current directory. But if it fails, it won't tell you why. Also, the current directory might not be where you want it saved. And if you're running this under a Web server, it might not have permission to write to the current directory.

[edited by: coopster at 9:44 pm (utc) on Jan. 2, 2007]
[edit reason] generalized url [/edit]
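A sketch that guards against those pitfalls: pick an explicit, writable directory with wget's -P (directory prefix) flag, and capture stderr so a failure actually says why. The directory and URL are assumptions:

```php
<?php
// Assumed directory the web server user can write to.
$dir = '/tmp/downloads';
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}
if (!is_writable($dir)) {
    die("$dir is not writable by the web server user\n");
}

$url = 'http://example.com/jpg/flyweb.jpg';
$cmd = '/usr/bin/wget -P ' . escapeshellarg($dir) . ' ' . escapeshellarg($url) . ' 2>&1';

exec($cmd, $output, $status);
if ($status !== 0) {
    // wget writes diagnostics to stderr; 2>&1 folded them into $output,
    // so the reason for the failure is visible here.
    echo implode("\n", $output), "\n";
}
```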