Recently I was looking at backup options in Linux.
Of course, with all the cloud craze nowadays, it is quite easy to let someone else worry about the setup and sync your files with an external service for free or very cheaply. Not to mention the convenience of accessing those files from anywhere and sharing them with other people.
There are cases in which I prefer a local setup, though.
- Security is probably handled better in online services, but it still feels different to have your files 'out there'.
- Large files take a long time to sync, so it is not always convenient.
- Some files are simply not needed outside the 'home context'.
Tools like grsync are great, but they did not quite satisfy my needs. One of their main shortcomings is that each 'session' covers only a single pair of source and destination paths.
Perhaps in a perfect setup, all files that you want to back up live in a single location. In reality this is rarely the case. There are things to back up scattered all over the place - mail and dotfiles in your home directory, media in another (perhaps on a separate partition or even a different disk), documents in a third, and so on.
OK, maybe I'm just sloppy with my files.
I started playing with a Python script that would call rsync and back up files to a USB hard drive. It is not much more complex than a shell script, but it allows for more flexibility.
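To give an idea, a minimal sketch of such a wrapper might look like the following. The source and destination paths are made-up examples, not my actual layout; the point is that multiple source/destination pairs live in one place, which is exactly what a single grsync session could not do.

```python
#!/usr/bin/env python3
"""Minimal sketch of an rsync wrapper; the paths below are illustrative."""
import subprocess

# Each entry pairs a source with its destination on the backup drive.
BACKUP_SETS = [
    ("/home/user/Documents/", "/mnt/backup/documents/"),
    ("/home/user/.config/", "/mnt/backup/dotfiles/"),
]

def build_cmd(src, dst, dry_run=False):
    """Assemble the rsync command line for one source/destination pair."""
    cmd = ["rsync", "-av", "--delete"]
    if dry_run:
        cmd.append("--dry-run")
    return cmd + [src, dst]

def run_backup(pairs, dry_run=False):
    """Run rsync once per pair; check=True aborts on the first failure."""
    for src, dst in pairs:
        subprocess.run(build_cmd(src, dst, dry_run=dry_run), check=True)

if __name__ == "__main__":
    # Print the commands instead of running them, so this is safe to try.
    for src, dst in BACKUP_SETS:
        print(" ".join(build_cmd(src, dst, dry_run=True)))
```

From here it is easy to add per-pair exclude lists, logging, or a config file, which is where a Python script starts paying off over plain shell.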
At around the same time, the quad-core, 1 GB RAM Raspberry Pi 2 Model B came out, and I thought it would make a wonderful home server. Its most important function would be backup, but there were other possibilities.
The idea started to form. The Pi would be a little more than a storage room:
- Files are backed up on a hard drive through rsync.
- Backed-up media can be played from another device over DLNA. The Pi can also be connected to an audio amp and act as the audio renderer.
- A central Git repository for all my code, at first just a collection of bare repos. These might be backed up separately.
- A web server can host small local services like wikis, as well as ownCloud.
- The web server can act as a staging area to test web projects.
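On the Git side, the "collection of bare repos" amounts to very little setup. Here is a sketch of the idea using a temporary directory in place of the Pi's drive; the repository name and paths are made up, and over the network you would use an ssh path (e.g. `pi@raspberrypi:/srv/git/myproject.git`) instead of a local one.

```shell
set -e
# Create a bare repository -- this is all the 'server side' needs.
rm -rf /tmp/pi-git-demo && mkdir -p /tmp/pi-git-demo
git init --bare /tmp/pi-git-demo/myproject.git

# From a client: clone it, make a commit, and push back to the central repo.
git clone /tmp/pi-git-demo/myproject.git /tmp/pi-git-demo/work 2>/dev/null
cd /tmp/pi-git-demo/work
git -c user.email=you@example.com -c user.name=you \
    commit --allow-empty -m "initial commit"
git push origin HEAD:master
```

A bare repo has no working tree, which is what makes it a sane push target; backing up the whole directory of bare repos is then just another rsync pair.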
In this tutorial I describe the setup I made, along with some possible tweaks. You might want to extend it further, for instance by syncing specific directories to the cloud.
The server in this setup was a Raspberry Pi running Raspbian. Go ahead and set up the Pi if you haven't.
However, this guide should work with any computer running *nix with minimal adjustments (e.g. paths).
If you are using a Raspberry Pi and intend to back up to a hard drive, bear in mind that you need a powered USB hub, as the Pi's ports do not provide enough power.
I looked around and found scripts such as this one, but they were still not exactly what I wanted. I decided to reuse my Python script and get it to rsync remotely to the rPi.
The rPi needs to be configured to run the various daemons, provide backup directories, etc.
Using rsync in daemon mode on the Pi seemed like the way to go, but it turned out slightly trickier than I thought (see Server config for more on this).
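For orientation, a daemon-mode setup revolves around an `/etc/rsyncd.conf` on the Pi that exports named modules. This is only a sketch; the module name, user, and paths here are invented for illustration:

```
# /etc/rsyncd.conf -- illustrative module name, user, and paths
uid = pi
gid = pi
use chroot = yes
read only = no

[backup]
    path = /mnt/backup
    comment = Backup area on the USB drive
    auth users = backupuser
    secrets file = /etc/rsyncd.secrets
```

A client then addresses the module with the double-colon syntax, along the lines of `rsync -av /home/user/Documents/ backupuser@raspberrypi::backup/documents/`. The details (and the tricky parts) are covered in the server configuration section below.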
So here we go: