Implementing a download manager that supports resuming

Posted by Idan K on Stack Overflow, 2009-04-04

Hi,

I intend to write a small download manager in C++ that supports resuming (and multiple connections per download).

From the info I've gathered so far, when sending the HTTP request I need to add a "Range" header field with the value "bytes=startoff-endoff". The server then returns an HTTP response containing only the data between those offsets.
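
To make that concrete, a single ranged request could look roughly like this with Qt's QNetworkAccessManager (Qt is what I'll probably end up using anyway; requestRange is just a placeholder name):

    #include <QNetworkAccessManager>
    #include <QNetworkReply>
    #include <QNetworkRequest>
    #include <QUrl>

    // Ask the server for just the bytes in [startOff, endOff] of the resource.
    QNetworkReply *requestRange(QNetworkAccessManager &manager,
                                const QUrl &url,
                                qint64 startOff, qint64 endOff)
    {
        QNetworkRequest request(url);
        // "Range: bytes=start-end" requests only that byte span (inclusive).
        request.setRawHeader("Range",
                             "bytes=" + QByteArray::number(startOff) + "-" +
                                 QByteArray::number(endOff));
        return manager.get(request);
    }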

So roughly what I have in mind is to split the file into as many parts as there are allowed connections, and send one HTTP request per part with the appropriate "Range" header. So if I have a 4 MB file and 4 allowed connections, I'd split the file into four parts and have 4 HTTP requests going, each with the appropriate "Range" field. Implementing the resume feature would involve remembering which offsets have already been downloaded and simply not requesting them again.
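
For the splitting itself, I'm thinking of something along these lines (plain C++; splitRanges is just an illustrative helper):

    #include <cstdint>
    #include <utility>
    #include <vector>

    // Split [0, fileSize) into `connections` contiguous ranges, returned as
    // inclusive (start, end) offset pairs suitable for "Range" headers.
    std::vector<std::pair<std::int64_t, std::int64_t>>
    splitRanges(std::int64_t fileSize, int connections)
    {
        std::vector<std::pair<std::int64_t, std::int64_t>> ranges;
        const std::int64_t chunk = fileSize / connections;
        for (int i = 0; i < connections; ++i) {
            const std::int64_t start = static_cast<std::int64_t>(i) * chunk;
            // The last range absorbs any remainder left by the division.
            const std::int64_t end =
                (i == connections - 1) ? fileSize - 1 : start + chunk - 1;
            ranges.emplace_back(start, end);
        }
        return ranges;
    }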

  • Is this the right way to do this?
  • What if the web server doesn't support resuming? (My guess is that it will ignore the "Range" header and just send the entire file.)
  • When sending the HTTP requests, should I specify the entire part's size in the range, or ask for smaller pieces, say 1024 KB per request?
  • When reading the data, should I write it immediately to the file or do some kind of buffering? I guess it could be wasteful to write small chunks.
  • Should I use a memory-mapped file? If I remember correctly, it's recommended for frequent reads rather than writes (I could be wrong). Is it sensible memory-wise? What if I have several downloads running simultaneously?
  • If I'm not using a memory-mapped file, should I open the file once per allowed connection, or simply seek whenever I need to write to it? (If I did use a memory-mapped file this would be really easy, since I could simply have several pointers.) See the sketch after this list for the seek-based idea.
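
For that last point, the single-file-plus-seek idea I have in mind looks roughly like this (plain standard C++; PartFile is just a placeholder name, and writes would still need to be serialized if chunks arrive on several threads):

    #include <cstddef>
    #include <cstdint>
    #include <fstream>
    #include <string>

    // One output stream shared by all connections of a download: each finished
    // chunk is written at its absolute offset after a seek, so only a single
    // file handle is needed.
    class PartFile {
    public:
        PartFile(const std::string &path, std::int64_t totalSize)
            : out_(path, std::ios::binary | std::ios::trunc)
        {
            // Pre-extend the file to its final size so every later seek lands
            // inside the file.
            if (totalSize > 0) {
                out_.seekp(totalSize - 1);
                out_.put('\0');
            }
        }

        void writeChunk(std::int64_t offset, const char *data, std::size_t len)
        {
            out_.seekp(offset);
            out_.write(data, static_cast<std::streamsize>(len));
        }

    private:
        std::ofstream out_;
    };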

Note: I'll probably be using Qt, but this is a general question, so the snippets above are only rough sketches.
