
You can also delegate a subdomain to another provider with an API, but yes I see what you mean. Although I feel like getting port 80 open would be difficult as well in those situations.
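A rough sketch of that delegation, with hypothetical names (the NS records go in the parent zone at your main provider):

```sh
# in the parent zone, delegate the subdomain to an API-capable provider:
#   home.example.com.  IN  NS  ns1.api-provider.example.
#   home.example.com.  IN  NS  ns2.api-provider.example.
# then confirm the delegation is live:
dig +short NS home.example.com
```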



I’d say they’re actually easier, at least in my experience. Since wildcard certs require DNS-01 validation through an API, you don’t need to deal with exposing port 80 directly to the internet.
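As a concrete sketch, here’s roughly what that looks like with certbot, assuming the certbot-dns-cloudflare plugin and a scoped API token (the domain and credentials path are placeholders):

```sh
# cloudflare.ini holds a scoped Cloudflare API token for DNS edits
certbot certonly \
  --dns-cloudflare \
  --dns-cloudflare-credentials ~/.secrets/cloudflare.ini \
  -d 'example.com' -d '*.example.com'
```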


You shouldn’t have to do anything specific at all; local network stuff works without internet, and Jellyfin doesn’t rely on any internet servers for authentication the way Plex does.


Odd, I’ve had a Pixel, a OnePlus 7 Pro, and now a Galaxy S21, and they all pick up my DNS server from DHCP without any issues.


If you have Private DNS turned off it doesn’t, unless maybe you have some manufacturer-specific weirdness going on with extra software.


Does a PC connected to the same wifi network as the phone get the proper DNS servers and work like it should?


Strange, have you checked the interface info on Android to see what DNS info it’s getting from the DHCP server?

Also check that it’s getting an IP on the 192.168.x.y network, and not some other subnet if the AP is doing funky things.


Do you have Private DNS enabled on Android? That would use a public DNS server by default, regardless of what DHCP configures.

Also check your browsers; some have their own DNS-over-HTTPS settings.


Frigate has been great, I’ve run it for years now.

I’m using OpenVINO on my Intel iGPU for hardware-accelerated object detection and encode/decode.
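If it helps, this is roughly the shape of my container setup; the paths are placeholders, so check Frigate’s docs for the current compose file and OpenVINO detector config:

```sh
# pass the Intel iGPU through so Frigate can use VAAPI/OpenVINO
docker run -d --name frigate \
  --device /dev/dri/renderD128 \
  -v /path/to/frigate/config:/config \
  ghcr.io/blakeblackshear/frigate:stable
```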


The tunnels are encrypted. But I don’t know if they use SSL or something else.


I looked around a while ago and didn’t really find anything good.

I think the best option is a Raspberry Pi and one of those 12-15" portable HDMI monitors.


Your router doesn’t handle LAN traffic, so an upgrade shouldn’t make any difference unless you have multiple VLANs, are passing traffic between them, and don’t have a Layer 3 switch handling the inter-VLAN routing.

I would probably start with an iperf test of download bandwidth from the server to the Pi. If that looks OK, then I would benchmark the NFS share for read speed on the Pi and make sure that’s not doing something weird.
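Something like this, assuming iperf3 on both ends and a placeholder server IP:

```sh
# on the server
iperf3 -s

# on the Pi: pull data from the server to test download bandwidth
iperf3 -c 192.168.1.10 -R

# rough NFS read benchmark on the Pi, bypassing the page cache
dd if=/mnt/nfs/some-large-file.mkv of=/dev/null bs=1M iflag=direct status=progress
```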

If that all looks good then I would probably suspect that Kodi either isn’t using hardware acceleration properly, or the specific media codec is not supported by the Pi for hardware acceleration.


> Security for a full blown web app is not trivial and has a bigger “attack surface” than a kdbx file moving p2p through my devices via syncthing.

Absolutely.

My Vaultwarden instance is only accessible via LAN or VPN though, I don’t think I’d want to expose it to the internet.


Not really, it uses some GPU power when it’s actively generating a response, but otherwise it just sits idle.


The other handy reason to keep torrent files around is that you can use them to verify that the data you have isn’t corrupted or changed in some way.
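For example, most clients can re-check existing data against the piece hashes in the .torrent; with Transmission it’s roughly this (the torrent ID is a placeholder):

```sh
# force a hash re-check of torrent #1's downloaded data
transmission-remote --torrent 1 --verify
```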


DNS is only used for the initial lookup on first load; after that the connection is made via IP and DNS isn’t involved.



Honestly, just replace the CMOS battery on a schedule if it’s a big deal; a UPS is nice to have, but it doesn’t really solve that specific problem.


Some BIOS manufacturers allow you to disable all halts on errors.

That will be reset to default if the CMOS battery is dead and power is removed though.


Server hardware will reset CMOS if the battery goes dead too.


You’re not really at risk of DDOS in that case, I wouldn’t worry about it.



Oooh yeah, I can imagine RAIDz2 on top of spinning disks would be very slow, especially with access times (atime) enabled on ZFS.

What backup software are you using? I’ve found restic to be reasonably fast.
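If atime is the culprit, it’s a one-liner to turn off, and a basic restic run looks something like this (the repo path and source dir are placeholders):

```sh
# stop ZFS from writing an access-time update on every file read
zfs set atime=off pool/dataset

# initialize the repository once, then every later run is incremental
restic -r /mnt/backup/restic init
restic -r /mnt/backup/restic backup /srv/data
```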


> You start the backup, db is backed up, now image assets are being copied. That could take an hour.

For the initial backup maybe, but subsequent incrementals should only take a minute or two.

I don’t bother stopping services; it’s too time-intensive to deal with setting that up.

I’ve yet to meet a service that can’t recover smoothly from the equivalent of a kill -9, and any that couldn’t sure wouldn’t stay on my list of stuff I run.



They have a proprietary motherboard and PSU, so I’d probably just look at doing a full DIY build instead.


Dell/HP SFF? 7th-9th gen CPUs, super cheap, quiet, and they should idle at 10-15W.

The only issue is that 4x 3.5" drives definitely won’t fit; I think you’d need a pretty unusual case to manage that at mini-ITX size.

Not sure which ones have NVMe slots, you’d have to research that.


Many DNS providers have an API and are supported by various dynamic DNS clients. I use Cloudflare and the built-in client on my OPNsense router.

OpenWrt should have a client too that supports a bunch of services.
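If you’d rather not run a dedicated client, the update is just one API call; a sketch against Cloudflare’s v4 API, with the zone/record IDs and token as placeholders:

```sh
# update an A record to the current public IP (hypothetical IDs)
curl -X PUT "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records/$RECORD_ID" \
  -H "Authorization: Bearer $CF_API_TOKEN" \
  -H "Content-Type: application/json" \
  --data "{\"type\":\"A\",\"name\":\"home.example.com\",\"content\":\"$(curl -s https://ifconfig.me)\"}"
```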


It just works and it’s in every distro’s default repo. It’s pretty easy to set up and can be a web server for static files, PHP sites, etc. It can also be a reverse proxy for HTTP(S) traffic or just forward raw TCP/UDP.
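A minimal reverse-proxy vhost sketch, with hypothetical names and ports:

```sh
# write a basic proxy config, test it, and reload
cat > /etc/nginx/conf.d/app.conf <<'EOF'
server {
    listen 80;
    server_name app.example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
nginx -t && systemctl reload nginx
```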

There’s also endless documentation out there for how to do something in nginx.

HAProxy is a nightmare to use in my experience. It just feels so clunky and old.

Caddy is nice, but downloading and updating it is a pain because you need modules that aren’t included in the repo version.



You could, but it’s easier to just disable the map feature in Immich if you don’t want to use it.


Yeah, that would be a nice feature to see. The mobile app is sometimes a little buggy loading photos on my phone too; it will load slowly, as if it’s pulling from the server, even though the photos are also stored locally on the phone.


Grab Docker Desktop, then I think you should just be able to follow the Linkwarden install docs. It’s been a while since I’ve used Docker on Windows, though.
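From memory it’s the usual Compose flow; treat the repo URL as an assumption on my part and follow their docs for the real files:

```sh
# grab Linkwarden's compose setup and bring it up (works in PowerShell too)
git clone https://github.com/linkwarden/linkwarden.git
cd linkwarden
docker compose up -d
```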


> Is immich in a usable state yet?

I’ve been using it for 388 days (as helpfully shown by the new buy button, nice touch), and it’s been stable and rock solid the entire time.

It has gone offline a few times due to breaking changes in the docker compose file (because I auto-update everything), but it’s always been like a two-minute fix to get it back online.

Everything is backed up on my server nightly with incremental backups, both locally and online. So I’m not really worried about something going catastrophically wrong and deleting all my photos or something.

> (just point to a folder and you’re good to go)

Immich has that with its external library support; it’s pretty easy to set up.


I think disabling by default and having a clear explanation of what enabling it involves is good.

Maybe during the initial account creation/onboarding on a new instance, it could ask whether server-wide maps should be enabled with the default provider, with clear text about what that involves.

The option to use other providers sounds good too.



Fair, it does depend on what games you’re hosting. I often have multiple servers for different games running and some can use upwards of 10GB of RAM each when in use.

Highest I’ve had I think was an Avorion server that hit around 20GB of RAM usage with 5 or so players on.

I find that VPS cores are often very low-performance, since providers want high core density in their servers rather than fewer high-performance cores, and games like Arma 3, Minecraft, Enshrouded, etc. really need high single-thread performance to work well.


For sure anything with private data involved, aside from my email.

So everything to do with images, videos, file/document storage, etc…

Also game servers, because they’re generally very easy to host at home, and with their typically high RAM and storage needs, paid hosting can be quite pricey.


There’s essentially no overhead with containers; they’re just isolated processes sharing the host kernel, not virtual machines, so performance is almost identical to bare metal in most cases.