How can one use online backup with large amounts of static data?

Posted by Billy ONeal on Super User
Published on 2010-10-29T22:45:19Z

I'd like to set up an offsite backup solution for about 500 GB of data that's currently stored across my various machines. I don't care about data retention rates, as this is only a backup of my data, not primary storage for it. If the backup is stored on crappy non-redundant systems, that doesn't matter.

The data set is almost entirely static, and mostly consists of things like installers for Visual Studio and installer disk images for all of my games.

I have found two services that meet most of these requirements:

  • Mozy
  • Carbonite

However, both services impose low bandwidth caps, on the order of 50 kb/s, which prevent me from backing up a data set of this size in any reasonable amount of time (somewhere on the order of 6 weeks), despite the fact that I get multiple MB/s upload speeds from this location to everywhere else. Carbonite has the additional problem that it excludes pretty much every file in my backup set, because the files are mostly .iso and .vmdk files, which aren't backed up by default.
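
To put rough numbers on that, here's a back-of-the-envelope sketch. The rates below are illustrative assumptions, not any service's documented cap, and the real figure also depends on whether a cap is quoted in kilobits or kilobytes per second:

```python
# Rough upload-time estimate for a 500 GB backup over a throttled connection.
# The cap values below are illustrative assumptions, not documented service limits.

def upload_days(total_gb: float, rate_kbytes_per_sec: float) -> float:
    """Days needed to upload total_gb at a sustained rate in kilobytes/second."""
    total_kbytes = total_gb * 1_000_000            # 1 GB = 10^6 kB (decimal units)
    seconds = total_kbytes / rate_kbytes_per_sec
    return seconds / 86_400                        # seconds per day

for rate in (50, 150, 1000):                       # throttled caps vs. an unthrottled multi-Mb/s link
    print(f"{rate:>5} kB/s -> {upload_days(500, rate):6.1f} days")
# 50 kB/s -> ~116 days; 150 kB/s -> ~39 days; even unthrottled it's several days of continuous upload.
```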

There are other services, such as EC2, which don't have such bandwidth caps, but they typically store data on highly redundant servers and therefore cost on the order of 10 cents/GB/month, which is insanely expensive for storage of this kind of data set. (At $50/month I could build my own NAS to hold the data, which would pay for itself after ~2-3 months.) (To be fair, they're offering quite a bit more service than I'm looking for at that price, such as public HTTP access to the data.)
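
For the cost comparison in that parenthetical, a quick sketch (the NAS build price here is a hypothetical placeholder, not a real quote):

```python
# Monthly cloud-storage cost vs. one-time cost of a home-built NAS.
# The $150 NAS build price is an assumed placeholder for illustration.

data_gb = 500
cloud_price_per_gb_month = 0.10        # ~10 cents/GB/month for redundant cloud storage
nas_build_cost = 150.0                 # assumed one-time hardware cost (hypothetical)

monthly_cloud_cost = data_gb * cloud_price_per_gb_month   # 500 * 0.10 = $50/month
payback_months = nas_build_cost / monthly_cloud_cost      # 150 / 50 = 3 months

print(f"Cloud: ${monthly_cloud_cost:.2f}/month; a NAS would pay for itself in ~{payback_months:.0f} months")
```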

Does anything exist meeting those requirements or am I basically hosed?
