I had a workflow a few years ago where I configured and ran a local Drupal instance, then ran HTTrack, which exported all the pages and images to flat HTML. I then zipped that up and pushed it to an S3 bucket to host the website. Worked great because it just needed to host info, no comments or accounts or anything.
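Roughly, that workflow can be sketched with two commands (the URL, output directory, and bucket name here are placeholders, not my exact setup):

```shell
# Mirror the local Drupal instance to flat HTML with HTTrack.
# "-O" sets the output directory; the site URL is whatever your
# local instance listens on.
httrack "http://localhost:8080" -O ./site-export

# Push the exported static files to S3; "--delete" removes files
# from the bucket that no longer exist locally.
aws s3 sync ./site-export/ s3://my-site-bucket/ --delete
```

With S3 static website hosting enabled on the bucket, that's the whole deploy.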
I had SSH open on a web server for 2.5 years because I was lazy, my IP changed a lot, and I traveled without a VPN setup, and I never had any issues as far as I could tell. I disabled password and root auth, but I was also fine with wiping that server if there were issues. It's certainly not recommended, but it isn't always immediately going to be a problem.
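For anyone wanting to do the same hardening, the two changes are a couple of sshd_config directives. A sketch assuming the Debian/Ubuntu default config path, so adjust for your distro:

```shell
# Disable password logins (key-based auth only) and root login.
# The sed patterns also uncomment the directives if they're commented out.
sudo sed -i 's/^#\?PasswordAuthentication.*/PasswordAuthentication no/' /etc/ssh/sshd_config
sudo sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config

# Apply without dropping existing sessions.
sudo systemctl reload sshd
```

Make sure your key is in `~/.ssh/authorized_keys` and test a second login before closing your current session.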
Does the app support push notifications? I'd be interested in this, but I already use tasks.org since they support push notifications; otherwise I won't take the trash out until right before bed instead of before it gets dark.
Once you learn it, it isn't super crazy, but it obviously takes a lot of effort. I think most people who use k3s and k8s at home are people who use it for work, so they already know how and where things should work. That said, I work with Kubernetes every day managing a handful of giant production clusters, and at home I use Unraid to keep it simple.
I have an all-Ubiquiti setup and only use local accounts for everything: a UDM Pro, two 8-port switches, and two APs (a U6 Mesh and an older model). One of my accounts had me turn on MFA, but every device still lets me use a local account with a password and SSH key. Do you know which devices are forcing that?
Hey! Finally gave it a go this morning, but ran into some headaches pointing it at my existing dockerized MySQL and Postgres containers on Unraid. I reached out on Discord this afternoon, but setting up auth according to the docker-compose on the site and GitHub, I get lots of errors about missing tables or properties during database initialization.
Good to see this works with AntennaPod; I just need to get a gpodder service set up on my Unraid server and give this a go. I've been using AntennaPod on my phone for the last several years, but I didn't do backups and exports often enough, and when my Samsung was dropped and died I lost 8 months of data. This would also make it a lot easier to stream on my desktop during work. Will be giving it a go here soon!
I've been trying to read through this to understand how it's all supposed to work. I guess the idea is that you can use the Beeper app, infrastructure, and APIs to talk to your Matrix server; the encryption/decryption/handshake happens here, between Matrix and Beeper, before messages are sent on to their servers for delivery and all that.
They do have a doc for this. https://immich.app/docs/administration/backup-and-restore/
I dump my Immich DB weekly, and every two weeks I sync the media folders to a remote destination.
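That backup routine looks roughly like this (container name, paths, and the backup host are examples; match them to your own compose setup):

```shell
# Weekly: dump the whole Immich Postgres database from its container
# and compress it with a dated filename.
docker exec -t immich_postgres pg_dumpall --clean --if-exists -U postgres \
  | gzip > "immich-db-$(date +%F).sql.gz"

# Every two weeks: mirror the media library to a remote destination.
# "--delete" keeps the remote copy an exact mirror of the local one.
rsync -a --delete /mnt/user/immich/library/ backup-host:/backups/immich/library/
```

Both commands drop into cron nicely; the Immich doc linked above covers the restore side.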
I think the simplest setup is keeping all the apps and services on the local network and doing something like this guide so they are always behind a VPN. Then set up another VPN on Unraid or another device to access them from outside the local network. There are plenty of other guides for Unraid, Plex, and the arr stack out there; Unraid is just what I use, but you can use whatever OS you prefer.
https://unraid-guides.com/2021/05/19/how-to-route-any-docker-container-on-unraid-through-a-vpn/
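The core trick in that guide is attaching a container to the VPN container's network namespace so all of its traffic goes through the tunnel. A minimal sketch with docker run (image names, config path, and app are placeholders):

```shell
# Start a WireGuard client container; it needs NET_ADMIN to manage
# the tunnel interface. Any ports the app needs must be published
# on THIS container, since the app shares its network stack.
docker run -d --name vpn --cap-add=NET_ADMIN \
  -v ./wg0.conf:/config/wg_confs/wg0.conf \
  -p 8080:8080 \
  lscr.io/linuxserver/wireguard

# Attach the app container to the VPN container's network namespace:
# all of someapp's traffic now exits through the WireGuard tunnel.
docker run -d --name someapp --network container:vpn someapp-image
```

If the VPN container restarts, the attached container usually needs a restart too, which is worth wiring up in whatever supervises your stack.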
I use Kavita and KavitaEmail: the former to organize and provide a frontend for my books, and the latter to email them to my Kindle if they aren't on there yet. My Kavita container is stopped most of the time because I already know what I'm going to read next and just need it up to sync or send new books.
I used to just keep the library I exported from Amazon and ebooks.com in a single folder on my NAS; Kavita helped clean it up a bit.
I also tried Audiobookshelf, but mostly for audiobooks and podcasts, and it didn't quite fit the workflow I already had; I liked using Kavita and AntennaPod.
I have a Cloudflare Tunnel set up for one service in my homelab, and it connects to my reverse proxy so the data between Cloudflare and my backend is encrypted separately. I get no malformed requests and no issues from Cloudflare, and even the remote public IP data comes through in the headers.
Everyone mentions this as an issue, and I'm sure most are doing the default of pointing cloudflared at a plain HTTP local service, but that's not the ONLY option.
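For anyone curious what the non-default looks like: cloudflared's ingress rules can point at an HTTPS origin. A sketch of the config, where the hostname, tunnel ID, and proxy address are all placeholders:

```shell
# Write a cloudflared config that terminates the tunnel at an HTTPS
# reverse proxy instead of a plain-HTTP service.
cat > ~/.cloudflared/config.yml <<'EOF'
tunnel: <TUNNEL-ID>
credentials-file: /root/.cloudflared/<TUNNEL-ID>.json
ingress:
  - hostname: app.example.com
    service: https://reverse-proxy.lan:443    # TLS to the backend, not http
    originRequest:
      originServerName: app.example.com       # SNI the proxy's cert answers for
  - service: http_status:404                  # catch-all rule (required)
EOF

cloudflared tunnel run
```

So the Cloudflare-to-backend leg is its own TLS session, separate from the visitor-to-Cloudflare one.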
Anything that is actually helpful and useful makes it easier to keep in touch with my aunts, cousins, and parents, who all have iPhones; I miss out on group chats since they won't install or use anything else. Apple isn't going to get rid of their walled garden unless forced, so I'm glad someone is trying something.
My work environments use Prometheus, node-exporter, and Grafana. At home I use Telegraf, InfluxDB, and Grafana (plus Prometheus for other app-specific metrics). The biggest reason I went with Telegraf and InfluxDB at home is that Prometheus scrapes data from the configured clients (pull), while Telegraf sends data to InfluxDB on a configured interval (push). When I started my homelab adventure I had two VMs in the cloud and two Pis at home, and having Telegraf send the data in to my Pis rather than going out and scraping made that remote setup a lot easier. I had InfluxDB set up behind a reverse proxy with auth, so Telegraf sent data over TLS and only needed to authenticate against that single endpoint. That's the major difference to me, but there are also different sets of exporters and plugins for each to tailor the data depending on what you want.
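The push side of that is just a small Telegraf output block. A sketch using the InfluxDB 1.x output plugin, where the endpoint and credentials are placeholders:

```shell
# Drop-in Telegraf output config: push metrics over TLS to an
# InfluxDB endpoint sitting behind a reverse proxy with basic auth.
cat > /etc/telegraf/telegraf.d/output-influxdb.conf <<'EOF'
[[outputs.influxdb]]
  urls = ["https://metrics.example.com:8086"]  # reverse proxy terminates TLS
  database = "telegraf"
  username = "telegraf"
  password = "changeme"
EOF
```

The remote VMs only ever make outbound HTTPS connections to that one URL, which is why nothing on them needs to be exposed for scraping.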
I self-host Kavita for about 30 ebooks and use KavitaEmail to send EPUBs to my Kindle. I also tried out Audiobookshelf, but only for podcasts, and it didn't quite match my current workflow, which AntennaPod running on just my phone handles well. I also recently saw that Audiobookshelf can host EPUBs and send them over SMTP from one container, so once its Android app comes out of beta and has better local file and Android Auto support, I may give it another shot.
I'm self-hosting cloudflared right now: the TLS from Cloudflare terminates in a container on my network and then traffic goes to my reverse proxy container for my local network. I'm definitely going to poke around Tailscale and their Funnels in the future; I'm just playing devil's advocate against the replies from people who don't know anything about Cloudflare Tunnels yet say they're the wrong choice.
Just my two cents, but I'd prefer my traffic going through Cloudflare vs Tailscale if it's all the same, since I've heard a lot about Tailscale but know nothing about it. I've interacted in GitHub threads with people from Cloudflare and they're all super nice, and their blog posts and post-mortems are very insightful. I was curious to see if people had actual insight, but it appears it's just a reflexive "Cloudflare = bad."
First I’m hearing of a V2, are there any threads on github or posts detailing this so I know what to look for?
Edit: just kidding, I found the issue and milestone on GitHub now.