Basically title. I’m in the process of setting up a proper backup for my configured containers on Unraid, and I’m wondering how often I should run my backup script. Right now, I have a cron job set to run on Monday and Friday nights; is this too frequent? What’s your schedule, and do you strictly back up your appdata (container configs), or is there other data you include in your backups?
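For context, the cron entry looks something like this (the exact time and script path here are illustrative):

```
# Run the backup script Monday and Friday nights (day-of-week 1 and 5)
0 2 * * 1,5 /boot/config/scripts/backup_appdata.sh
```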

@madame_gaymes@programming.dev

I’m always backing up with SyncThing in realtime, but every week I do an off-site type of tarball backup that isn’t within the SyncThing setup.
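A minimal sketch of that weekly tarball job (paths are hypothetical):

```bash
# Weekly off-site tarball, kept outside the SyncThing-managed folders
tar czf /mnt/offsite/appdata-$(date +%F).tar.gz /srv/appdata
```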

@Darkassassin07@lemmy.ca

I run Borg nightly, backing up the majority of the data on my boot disk, including Docker volumes and config, plus a few extra folders.

Each individual archive is around 550 GB, but because of the de-duplication and compression it’s only ~800 MB of new data each day, taking around 3 minutes to complete the backup.

Borg’s de-duplication is honestly incredible. I keep 7 daily backups, 3 weekly, 11 monthly, then one for each year beyond that. The 21 historical backups I have right now would be 10.98 TB of data raw. After de-duplication and compression they take up only 407.98 GB on disk.

With that kind of space savings, I see no reason not to keep such frequent backups. Hell, the whole archive takes up less space than one copy of the original data.
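A minimal sketch of a nightly job with that retention policy; the repo path, compression choice, and source directories are assumptions:

```bash
#!/bin/bash
# Nightly Borg backup -- repo path and sources are placeholders
REPO=/mnt/backup/borg-repo

# Deduplicated, compressed archive named by host and date
borg create --stats --compression zstd \
    "$REPO::{hostname}-{now:%Y-%m-%d}" \
    /etc /root /var/lib/docker/volumes

# 7 daily, 3 weekly, 11 monthly; Borg has no "infinite yearly",
# so use a large keep count for the one-per-year tail
borg prune "$REPO" \
    --keep-daily 7 --keep-weekly 3 --keep-monthly 11 --keep-yearly 100
```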

@FryAndBender@lemmy.world

+1 for borg


                       Original size    Compressed size    Deduplicated size
    This archive:          602.47 GB          569.64 GB             15.68 MB
    All archives:           16.33 TB           15.45 TB            607.71 GB

                       Unique chunks       Total chunks
    Chunk index:             2703719           18695670
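Those figures are Borg’s own stats output; a sketch of where they come from (repo path hypothetical):

```bash
# Print the same stats after each run...
borg create --stats /mnt/backup/borg-repo::'{hostname}-{now}' /data

# ...or on demand for the most recent archive
borg info --last 1 /mnt/backup/borg-repo
```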

Sips' (creator)

Thanks for sharing the details on this, very interesting!

I do not, as I cannot afford the extra storage required to do so.

SavvyWolf

Daily backups here. Storage is cheap. Losing data is not.

Once every 24 hours.

@IsoKiero@sopuli.xyz

Yep. Even if the data I’m backing up doesn’t really change that often. Perhaps I should start backing up files from my laptop and workstation too. Nothing too important is stored only on those devices, but reinstalling and reconfiguring everything is a bit of a chore.

@desentizised@lemm.ee

rsync from ZFS to an off-site Unraid box every 24 hours, 5 times a week. On the sixth day it does a checksum-based rsync, which obviously means more stress, so I only do that once a week. The seventh day is reserved for ZFS scrubbing, every two weeks.
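A sketch of the two modes (hosts and paths hypothetical):

```bash
# Daily pass: quick comparison by size + modification time
rsync -a --delete /tank/data/ backup-host:/mnt/user/backup/

# Weekly pass: -c reads and checksums every file, hence the extra stress
rsync -a --delete -c /tank/data/ backup-host:/mnt/user/backup/
```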

slazer2au

Backups???

metaStatic

Raid is a backup.

slazer2au

That is what the B in RAID stands for.

@AtariDump@lemmy.world

Just like the “s” in IoT stands for “security”

🤣

Avid Amoeba

What’s the second B stand for?

@meyotch@slrpnk.net

Beets.

Or bears.

Or buttsex.

It’s context dependent, like “cool”.

metaStatic

cool

Avid Amoeba

If Raid is backup, then Unraid is?

battlesheep

I back up all of my Proxmox LXCs/VMs to a Proxmox Backup Server every night, and sync those backups to another PBS in another town. A second Proxmox backup runs every noon to my NAS. (I know, the 3-2-1 rule is not reached…)

slax

I have

  • Unraid backs up its USB flash drive
  • Unraid appdata gets backed up weekly by a community application (CA Appdata Backup), and I use rclone to push that to an old Box account (100GB for life…). I did have it encrypted, but it seems I need to fix that…
  • A parity drive in my Unraid (8TB)
  • I am trying to understand how to use rclone to back up my photos to Proton Drive, so that’s next (see the sketch below).

Music and media are not too important yet, but I would love some insight.
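For the Proton Drive step: recent rclone releases include a protondrive backend, so the sync could look like this (remote name and paths are placeholders):

```bash
# One-time interactive setup: add a remote using the "protondrive" backend
rclone config

# Then push the photos up (remote name "proton" is a placeholder)
rclone sync /mnt/user/photos proton:photos --progress
```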

@zero_gravitas@aussie.zone

> Right now, I have a cron job set to run on Monday and Friday nights, is this too frequent?

Only you can answer this. How many days of data are you prepared to lose? What are the downsides of running your backup scripts more frequently?

@ikidd@lemmy.world

Proxmox servers are mirrored zpools, not that RAID is a backup. Replication between Proxmox servers every 15 minutes for HA guests, hourly for less critical guests. Full backups with PBS at 5AM and 7PM, 2 sets apiece, with one set that goes off-site and is rotated weekly. Differential replication every day to zfs.rent. I keep 30 dailies, 12 weeklies, 24 monthlies, and infinite annuals.
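The zfs.rent piece is plain incremental ZFS replication; a sketch, with dataset and snapshot names hypothetical:

```bash
# Take today's snapshot, then send only the delta since yesterday's
zfs snapshot tank/guests@2024-01-02
zfs send -i tank/guests@2024-01-01 tank/guests@2024-01-02 \
    | ssh user@zfs.rent zfs recv -F tank/guests
```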

Periodic test restores of all backups at various granularities at least monthly or whenever I’m bored or fuck something up.

Yes, former sysadmin.

This is very similar to how I run mine, except that I use Ceph instead of ZFS. Nightly backups of the CephFS data with Duplicati, followed by staggered nightly backups for all VMs and containers to a PBS VM on the NAS. File backups from Unraid get sent up to CrashPlan.

Slightly fewer retention points to cut down on overall storage, and a similar test pattern.

Yes, current sysadmin.

@ikidd@lemmy.world

I would like to play with Ceph, but I don’t have a lot of spare equipment anymore, and I understand ZFS pretty well and trust it. Maybe on the next cluster upgrade, if I ever do another one.

And I have an almost unhealthy paranoia after seeing so many shitshows in my career, so having a pile of copies just helps me sleep at night. The day I have to delve into the last layer is the day I build another layer, but that hasn’t happened recently. PBS dedup is pretty damn good, so it’s not much extra to keep a lot of copies.

Lucy :3

Every hour, automatically

Never on my laptop, because I’m too lazy to create a mechanism that detects when a backup is possible.
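For what it’s worth, the detection can be as simple as a reachability check wrapped around the hourly job (hostname and script path hypothetical):

```bash
#!/bin/bash
# Hourly cron job: silently skip when the backup target isn't reachable
if ping -c1 -W2 backup-host >/dev/null 2>&1; then
    /usr/local/bin/run-backup.sh
fi
```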

@thejml@lemm.ee

I just tell it to back up my laptops every hour anyway. If it’s not on, it just doesn’t happen, but it’s generally on enough to capture what I need.

I use Duplicati for my backups, and have backup retention set up like this:

Save one backup each day for the past week, then save one each week for the past month, then save one each month for the past year.

That way I have granular backups for anything recent, and the further back in the past you go, the less frequent the backups are, to save space.
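In Duplicati that scheme maps onto the custom retention option; a sketch (the exact timeframes are my reading of the scheme above):

```
# Duplicati advanced option: <timeframe>:<interval>, comma-separated
--retention-policy="1W:1D,1M:1W,1Y:1M"
```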

Depends on the system, but weekly at least.

> I have a cron job set to run on Monday and Friday nights, is this too frequent?

Only you can answer that - what is your risk tolerance for data loss?
