If you’ve got a copy of the data that’s local, why are you opening up ports? Just run the backup job internally.
I’m often not at home for weeks at a time.
But man, do I not trust a USB interface at all.
Trust?
I also recommend not relying on email for notifications - too unreliable. I use the healthchecks.io docker image and have it send me notifications via Pushover when something fails.
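To make this concrete, a minimal crontab sketch of that pattern (the check UUID and script path are placeholders; healthchecks.io alerts you, e.g. via Pushover, when the expected ping doesn't arrive or the /fail endpoint is hit):

```
# Run the backup nightly; ping healthchecks.io on success, hit /fail otherwise.
# YOUR-UUID-HERE and /usr/local/bin/backup.sh are placeholders -- substitute your own.
0 3 * * * /usr/local/bin/backup.sh && curl -fsS -m 10 --retry 3 https://hc-ping.com/YOUR-UUID-HERE || curl -fsS -m 10 https://hc-ping.com/YOUR-UUID-HERE/fail
```

The nice part is that this also catches the case where the machine never runs the job at all: no ping, and healthchecks.io notifies you anyway.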
My backup game is pretty bad: I only have my primary copy of my data and a cloud storage copy. I was trying to think of a cheap way to add another backup, and then realized I have an Orange Pi Zero 2 and a 1TB USB SSD lying around. So I was thinking of:
- installing Debian on the OPZ2, and setting up key-authenticated SFTP (no password auth)
- connecting the OPZ2 to my home network and forwarding a non-standard port (i.e. not 22) for SFTP
- having a subdomain point to my home network IP, using DDNS to keep it in sync
- using Restic to remotely push password-encrypted backups to the OPZ2 via SFTP using the subdomain
- setting a cron job to check disk health and email myself if it degrades
- enabling automatic updates on Debian, with an email to myself on failure
Is this setup a bad idea? Is this a security nightmare? Any better suggestions?
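For what it's worth, the restic side of this is only a few commands. A sketch, assuming the subdomain is backup.example.com, SFTP listens on port 2222, and the repo lives at /srv/restic (all placeholders):

```
# One-time: initialise the encrypted repository on the OPZ2 (prompts for a repo password).
restic -r sftp://backupuser@backup.example.com:2222//srv/restic init

# Recurring: push a snapshot; restic encrypts client-side, so only ciphertext reaches the Pi.
restic -r sftp://backupuser@backup.example.com:2222//srv/restic backup /home/me/data

# Periodically: verify the remote repository is intact, not just present.
restic -r sftp://backupuser@backup.example.com:2222//srv/restic check
```

Note the double slash in the URL: restic's SFTP backend uses it to mark an absolute path on the remote host. Since the backups are encrypted before they leave your machine, a compromised Pi leaks ciphertext, not data.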
Heya, I'm trying out Lemmy and kinda like the idea of hosting a Lemmy instance just for me.
I was wondering:
- What are the hardware/bandwidth requirements for a single user instance?
- I know different instances can blacklist each other, but can they whitelist each other too? I don't want to be automatically unable to see or interact with certain instances.
- Has anyone else done this and have thoughts to share?
- What about doing the same for Mastodon?
I’ll look into this, thanks!