I've done a bit of image uploading before, and while it's not easy, it's not really hard either, right?
Has anyone done this, though: set it up so that you could copy an image URL (from another server) and have the script download the image and treat it as if it had been uploaded?
With me? I'm building affiliate DBs and need to automate the process of uploading product photos as much as possible.
Many thanks, hope that's understandable!
I don't know what to recommend, I just know it's not fast :(
I suppose you could use wget from the command line...
wget <url> will fetch the file at the specified URL and save it in your current directory. To specify a directory and filename, use
wget --output-document=<filename> <url>
You can also use --limit-rate=20k, for example, to cap the download rate at 20 KB/s if you don't want to saturate your server or the remote server. There are also options for specifying lists of files and such; just check the man page.
Also, PHP can download files directly on its own and write them to disk:
fopen() works the same on remote URLs as on local files when you're reading (as long as allow_url_fopen is enabled). It works for images just fine. The trick then becomes finding all the image URLs.
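Something along these lines should do it, a minimal sketch rather than production code. The URL, the save directory, and the filename handling are placeholders you'd adapt to your own setup:

```php
<?php
// Sketch: fetch a remote image and save it locally, so the rest of the
// script can treat it like a normal upload. Assumes allow_url_fopen is on.
// $url and $saveDir are example values, not real paths.
$url     = 'http://example.com/images/product.jpg';
$saveDir = '/var/www/uploads/';

$data = file_get_contents($url);   // reads the whole remote file into a string
if ($data === false) {
    die("Could not download $url");
}

// Derive a local filename from the URL path and write the bytes out.
$filename = basename(parse_url($url, PHP_URL_PATH));
file_put_contents($saveDir . $filename, $data);
?>
```

From there you can run the saved file through the same validation/resizing code you'd use on a regular form upload.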
imagecreatefromgd2 -- Create a new image from GD2 file or URL
imagecreatefromgd2part -- Create a new image from a given part of GD2 file or URL
imagecreatefromgd -- Create a new image from GD file or URL
imagecreatefromgif -- Create a new image from file or URL
imagecreatefromjpeg -- Create a new image from file or URL
imagecreatefrompng -- Create a new image from file or URL
imagecreatefromstring -- Create a new image from the image stream in the string
imagecreatefromwbmp -- Create a new image from file or URL
imagecreatefromxbm -- Create a new image from file or URL
imagecreatefromxpm -- Create a new image from file or URL
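Since those imagecreatefrom* functions accept URLs directly, you can skip the separate download step entirely and process the remote image in one go. A sketch (the URL, output path, and thumbnail size are made-up examples):

```php
<?php
// Sketch using GD: load a remote JPEG straight from its URL, scale it to a
// thumbnail, and save the result as if it had been uploaded locally.
$src = imagecreatefromjpeg('http://example.com/images/product.jpg');
if (!$src) {
    die('Could not load remote image');
}

// Scale to 200px wide, preserving the aspect ratio.
$w      = imagesx($src);
$h      = imagesy($src);
$thumbW = 200;
$thumbH = (int) ($h * $thumbW / $w);

$thumb = imagecreatetruecolor($thumbW, $thumbH);
imagecopyresampled($thumb, $src, 0, 0, 0, 0, $thumbW, $thumbH, $w, $h);

imagejpeg($thumb, '/var/www/uploads/product_thumb.jpg', 85); // quality 85
imagedestroy($src);
imagedestroy($thumb);
?>
```

Handy for affiliate feeds, since you usually want a resized copy anyway rather than the original at full size.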