My dad also used to self-host. Now I run all of the services he used to, and more.
Anyway, his server is still around, so I thought I might as well use it as an offsite backup box. I run a Matrix instance and Nextcloud, as well as other things, but those two hold the content that matters most.
How would you set this up?
Just a nightly rsync over SFTP? That seems inefficient. Is there a best way to do this?
You can connect your main server and backup server to a VPS with WireGuard. The main server backs up Proxmox VMs and CTs to a Proxmox Backup Server instance on the backup server. Nextcloud data can be backed up with something like Duplicati, encrypted over SFTP to the backup server. The only hiccup with backing up Nextcloud is that you should put it into maintenance mode first. You can write a pre-backup script and a post-backup script for Duplicati to enable and disable maintenance mode.
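A minimal sketch of those two scripts, assuming a stock Nextcloud install at /var/www/nextcloud running as www-data (both are assumptions, adjust for your setup). Duplicati can call them via its --run-script-before and --run-script-after advanced options:

```shell
# Write the pre-backup script: enable maintenance mode before Duplicati runs
cat > /tmp/nc-pre-backup.sh <<'EOF'
#!/bin/sh
sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --on
EOF

# Write the post-backup script: disable maintenance mode when the backup is done
cat > /tmp/nc-post-backup.sh <<'EOF'
#!/bin/sh
sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --off
EOF

chmod +x /tmp/nc-pre-backup.sh /tmp/nc-post-backup.sh
```

Then in the Duplicati job, set --run-script-before=/tmp/nc-pre-backup.sh and --run-script-after=/tmp/nc-post-backup.sh (you'd probably keep them somewhere more permanent than /tmp).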
VPS is worth considering. I run no VMs.
Duplicati seems like a good option.
Aware of maintenance mode. Thanks!
WireGuard tunnel, carve off an LV on his network, expose it as an AoE/iSCSI/whatever device over WG, connect it on your end, write a LUKS volume to it. Periodically connect, unlock, mount, rsync all your stuff to it, unmount, lock, disconnect?
Or just create a GPG'd backup set and rsync it over WG? Or just rsync over WG if you don't care about encryption or anything.
Tell us a bit about your environment! Are you all Linux, or do you have Windows as well? Are you running a hypervisor like Proxmox or VMware, or using containers? Are you just making complete backups, or can you foresee yourself needing granular file restores? There are a number of ways you could go, depending on your setup.
I personally run a Proxmox cluster with both Windows and Linux servers. I perform local full-VM backups using the hypervisor to a USB disk, which gives me a fast way to restore VMs if I need to. I also run Veeam, which handles the offsite copy and provides granular file restores. It's nice because the community edition supports hardened disk immutability, which can help prevent ransomware attacks and Unfortunate Incidents. That just runs over SSH and installs a Veeam agent/repo on the remote Linux box.
All linux.
Mostly running stuff directly, though I have some things in containers.
I’ve consolidated configs and such into just a few folders that I can bring over to a new system to get everything running again without losing anything.
This backup will likely only ever be needed in a catastrophic failure scenario where my local system is entirely lost.
Unless that happens, I already have enough redundancy locally to recover from any lesser mishaps.
If you are all Linux, perhaps you'd like Borg. It's pretty easy to set up, and it offers the de-duplication you crave. It's also got some GUIs you can run, if you're into that kind of thing. If you don't feel inclined to use any backup software, then rsync over SSH instead of SFTP is a solid way to go.
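A rough sketch of what a nightly Borg job over SSH might look like, written to a script file here since it can't run without the remote repo. The repo URL, WG peer address, backed-up paths, and passphrase file are all assumptions; the one-time `borg init` is left as a comment since it only runs once:

```shell
# Nightly Borg backup sketch; adjust repo, paths, and passphrase handling.
cat > /tmp/borg-backup.sh <<'EOF'
#!/bin/sh
set -e
export BORG_REPO=ssh://backup@10.0.0.2/backups/borg
export BORG_PASSPHRASE="$(cat /root/.borg-pass)"

# One-time setup (run manually): borg init --encryption=repokey "$BORG_REPO"

# Deduplicated, compressed archive named after host and date
borg create --stats --compression zstd \
    ::'{hostname}-{now:%Y-%m-%d}' \
    /srv/configs /srv/nextcloud-data /srv/matrix

# Keep a rolling retention window of archives
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
EOF
chmod +x /tmp/borg-backup.sh
```

Drop something like that into a cron job or systemd timer on the main server, and restores are just `borg mount` or `borg extract` from the other end.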
Good options. Thanks!