I have a home server that I use for hosting files. I’m worried about it breaking and losing access to the files. So what method do you use to back up everything?
Veeam Agent backs up to an on-site NAS, and the NAS is backed up nightly to IDrive because it’s the cheapest cloud backup service I could find with Linux support. It’s a bit slow and very CPU-bound, but it’s robust and their support is pretty responsive.
Most of the files are actually in Nextcloud, so I get one more copy of the files (a copy, not a backup) on my PC by syncing with the Nextcloud app.
https://somedaysoon.xyz/posts/tech/backups/
But tl;dr of that:
I use a Proxmox server and Proxmox Backup Server (in a VM 🫣) to do encrypted backups.
A Raspberry Pi has SSH access to PBS; it rsyncs all the files and then uploads them to Backblaze using rclone.
https://2.5admins.com/ recommended “pull” backups, so if someone hacks your server they don’t have access to your backups. If the Pi is hacked it can mess with everything, but the idea is that it has a smaller attack surface (just SSH).
PS: If you rclone a lot of files to Backblaze, use https://rclone.org/docs/#fast-list , or else it will get expensive.
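For reference, a sketch of what that upload step can look like (the bucket and remote names here are made up, and this assumes a B2 remote is already configured in rclone):

```shell
# Hypothetical remote/bucket names. --fast-list gathers the remote listing
# with fewer, larger API calls, trading some memory for far fewer class-C
# transactions, which is what gets expensive on Backblaze B2.
rclone sync /srv/backups b2:my-backup-bucket --fast-list --transfers 8
```

`--transfers` just parallelizes uploads; the cost saving comes from `--fast-list`.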
Don’t overthink it… servers/workstations rsync to a NAS, then sync that NAS to another NAS off-site.
Various different ways for various different types of files.
Anything important is shared between my desktop PCs, servers and my phone through Syncthing. Those Syncthing folders are all also shared with two separate servers (in two separate locations) with hourly, daily, weekly and monthly volume snapshotting. Think your financial administration, work files, anything you produce or write, your main music collection, etc… It’s also a great way to keep your music in sync between your desktop PC and your phone.
Servers have their configuration files (/etc, /var/log, /root, etc.) rsynced every 15 minutes to the same two backup servers, also to snapshotted volumes. That way, should any one server burn down, I can rebuild it in a trivial amount of time. The same goes for user profiles, document directories, ProgramData, and anything non-synced on Windows PCs.
Specific data sets, like database backups, repositories and such, are also generally rsynced regularly, some to snapshotted volumes, some to regular ones, depending on the size and volatility of the data.
Bigger file shares, like movies, TV shows, etc., I don’t back up, but they’re stored on a distributed GlusterFS, so if any one server goes down, that doesn’t lose me everything just yet.
Hardware will fail, sooner or later. You should see any one device as essentially disposable, and have anything of worth synced and archived automatically.
A simple script using duplicity to FTP data to my private website, which has infinite storage. I can’t say if it’s good or not; it’s my first time doing it.
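For anyone curious, a duplicity-to-FTP run looks roughly like this (host, user and paths here are made up; duplicity reads the password from the FTP_PASSWORD environment variable and GPG-encrypts archives by default):

```shell
# Hypothetical host and paths, shown only as a sketch of the shape of such a script.
export FTP_PASSWORD='changeme'   # stand-in; set your real FTP password

# Incremental backup of a local directory to the FTP backend.
duplicity /home/me/data ftp://backupuser@example.com/backups

# Prune chains older than three months so storage doesn't grow forever.
duplicity remove-older-than 3M --force ftp://backupuser@example.com/backups
```

Restores go the other way: `duplicity ftp://backupuser@example.com/backups /home/me/restore`.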
How do you have infinite storage? Gsuite?
I confirm that in the terms and conditions they discourage use as a private cloud backup; it’s meant only for hosting stuff related to the website. That said, I’ve had no complaints so far, as I’ve been paying and keeping traffic to a minimum. I guess I’ll have to switch to something more cloud-oriented if I keep expanding. But it’s worked for now!
If you are using Kubernetes, you can use Longhorn to provision PVCs. It offers easy S3 backup along with snapshots. It has saved me a few times.
Compressed pg_dump, rsync’ed to an off-site server.
Borgbackup, using borgmatic as a frontend, to a storage VPS. I back up dozens of machines this way. I simply add a user account for each machine on the VPS, then each machine backs up over SSH to its own account.
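The pg_dump half of that can be as small as this (database name, paths and host are placeholders, assuming local socket auth):

```shell
# Hypothetical names. -Fc writes PostgreSQL's compressed custom format,
# which pg_restore can later restore selectively (per table, per schema).
pg_dump -Fc mydb > /var/backups/mydb.dump

# Ship the dump directory off-site; -a preserves attributes.
rsync -a /var/backups/ backup@offsite.example.com:/srv/pg-backups/
```

Restoring is then `pg_restore -d mydb /var/backups/mydb.dump` on the target.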
Proxmox backs up the VMs -> backups are uploaded to the cloud.
You guys back up your server?
If your data is replaceable, there’s not much point unless it would be a long wait or a high cost to get it back. That’s why I don’t have many backups.
In the 20 years that I’ve been running a home server, I’ve never had anything worse than a failed disk in the array, which didn’t cause any data loss.
I do have backups since it’s a good practice and also because it familiarizes me with the software and processes as they change and update so my skillset is always fresh for work purposes.
Rsnapshot on a second server, keeping 7 daily backups, 4 weekly backups, and 6 monthly backups.
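That retention maps onto rsnapshot’s config plus a crontab roughly like this (a sketch; paths and times are arbitrary, and rsnapshot.conf requires tabs between fields):

```
# In /etc/rsnapshot.conf (tab-separated):
#   retain  daily   7
#   retain  weekly  4
#   retain  monthly 6
#
# Cron then drives each interval, longest-running least often:
# m  h dom mon dow   command
30 3  *   *   *     /usr/bin/rsnapshot daily
0  3  *   *   1     /usr/bin/rsnapshot weekly
30 2  1   *   *     /usr/bin/rsnapshot monthly
```

rsnapshot hard-links unchanged files between snapshots, so 17 retained snapshots cost far less than 17 full copies.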
Same setup!
Additionally, about once a month I grab a copy of the latest daily backup onto a USB drive and store it away from the server.
I’m lucky enough that my backup server is at my parents’ place, in their basement, so it’s off-site already.
ZFS raidz2 pool. Not a perfect backup, but it covers disk failure (I already lost one disk with no data loss) and accidental file deletion. I’m vulnerable to my house burning down, but overall I sleep well enough.
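For the accidental-deletion part, ZFS snapshots are the usual mechanism; a sketch with a hypothetical pool/dataset name:

```shell
# Hypothetical pool/dataset "tank/files". raidz2 survives two disk failures;
# snapshots are what make deleted files recoverable.
zfs snapshot tank/files@daily-$(date +%Y-%m-%d)

# List snapshots of the dataset.
zfs list -t snapshot -r tank/files

# Deleted files can be copied back out of the hidden read-only snapshot dir,
# e.g. (snapshot name shown is illustrative):
#   cp /tank/files/.zfs/snapshot/daily-2024-01-01/important.doc /tank/files/
```

Tools like sanoid or zfs-auto-snapshot can automate creating and pruning these.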
Cron jobs with rsync to a Synology NAS, and then to Synology’s cloud backup.