How to back up a database with thousands of files

Posted by Neal on Server Fault
Published on 2012-07-04T13:40:03Z


I am working with a Fedora server that runs a customized software package. The server software is quite old, and its database consists of 1,723 files. The database files are constantly changing - they continually grow, and changes are not necessarily appended to the end. Right now we back up every 24 hours, at midnight, when all users are off the system and the database is in an internally consistent state.

The problem is that we have the potential to lose an entire day's worth of work, which would be unrecoverable. So I'd like to know if there is a way to take some sort of an instantaneous snapshot of these database files that we could back up every 30 minutes or so.

I've read about Linux LVM snapshots, and am thinking that I might be able to accomplish the goal by taking a snapshot, rsync'ing the files to a backup server, then dropping the snapshot. But I've never done this before, so I don't know if this is the "right" fix.
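For what it's worth, here is a minimal sketch of what that snapshot-then-rsync cycle might look like. It assumes the database directory lives on its own LVM logical volume with free extents left in the volume group; all of the volume group, logical volume, mount point, and destination names below are hypothetical placeholders, not anything from the actual setup:

```shell
#!/bin/sh
# Sketch: back up a point-in-time view of a database directory via an
# LVM snapshot. Names (vg0, dbdata, backuphost, paths) are hypothetical.
# Requires root and unallocated extents in the volume group.
set -eu

VG=vg0                                   # volume group holding the DB volume
LV=dbdata                                # logical volume with the 1,723 files
SNAP=${LV}_snap
MNT=/mnt/${SNAP}
DEST=backup@backuphost:/backups/dbdata/  # rsync destination (placeholder)

# 1. Create a copy-on-write snapshot. The size must be large enough to
#    absorb all writes to the origin volume while the backup runs;
#    if it fills up, the snapshot is invalidated.
lvcreate --size 5G --snapshot --name "$SNAP" "/dev/$VG/$LV"

# 2. Mount the snapshot read-only and copy the frozen view of the files.
mkdir -p "$MNT"
mount -o ro "/dev/$VG/$SNAP" "$MNT"
rsync -a --delete "$MNT"/ "$DEST"

# 3. Drop the snapshot so it doesn't keep accumulating COW data all day.
umount "$MNT"
lvremove -f "/dev/$VG/$SNAP"
```

One caveat I'm aware of: the snapshot is atomic at the block level, so restoring from it is equivalent to the state after a sudden power loss at that instant. If this old software can't recover cleanly from that kind of crash-consistent state, the database would need to be quiesced or flushed before the snapshot is taken. Run from cron (e.g. a `*/30 * * * *` entry) to get the 30-minute cadence.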

Any ideas on this? Any better solutions?

Thanks!

