I'm particularly curious about "download managers" and accelerators - the tools that claim to speed up download times and, in particular, offer the ability to resume broken downloads.
Surely if people are downloading over HTTP they can't resume anyway?
Has anyone come across a reliable way of delivering such large files? In the broadband era I'd guess it can't be such an unusual requirement... even if it's complicated to do reliably.
> Surely if people are downloading over HTTP they can't resume anyway?
Yes, they can, if the download manager keeps track of the last 'piece' of the file successfully downloaded and saves this information on the client machine. HTTP/1.1 supports 'partial content' requests, so a client can request, for example, byte 1,032,468 through byte 1,094,762 of a file from the server. If this request succeeds, you'll see a 206 Partial Content response in your server log file.
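As a rough sketch of the mechanics (Python, with hypothetical helper names - not any particular download manager's code), a resuming client builds a Range header from the bytes it already has on disk, and parses the Content-Range header that comes back on a 206 response:

```python
import re

def range_header(bytes_already_have):
    # Ask the server for everything from the first missing byte onward.
    # Per HTTP/1.1, "bytes=N-" means "from offset N to the end of the file".
    return {"Range": "bytes=%d-" % bytes_already_have}

def parse_content_range(value):
    # A 206 Partial Content response carries a Content-Range header such as
    # "bytes 1032468-1094762/2000000"; extract (first, last, total).
    # The total length may be "*" if the server doesn't know it.
    m = re.match(r"bytes (\d+)-(\d+)/(\d+|\*)", value)
    if not m:
        raise ValueError("unexpected Content-Range: %r" % value)
    first, last = int(m.group(1)), int(m.group(2))
    total = None if m.group(3) == "*" else int(m.group(3))
    return first, last, total

# Resuming after 1,032,468 bytes already saved:
# range_header(1032468) -> {"Range": "bytes=1032468-"}
```

If the server doesn't support ranges, it simply replies 200 OK with the whole file, so the client must check for the 206 status before appending to the partial file rather than overwriting it.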
See RFC 2616 "Range Units [w3.org]" and the other sections linked from there.