• 0 Posts
  • 51 Comments
Joined 1Y ago
Cake day: Jun 13, 2023


The requests from the other machines go through the firewall and get redirected; the requests from the NAS are essentially connecting to localhost, so no redirection happens there, since those requests never leave the machine.
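
As a sketch of why: on a typical iptables firewall the redirect lives in the nat PREROUTING chain, which only sees traffic arriving on an interface. The interface name and addresses below are made up for illustration:

```shell
# Hypothetical DNAT rule on the firewall; IPs/ports are assumptions.
# Packets arriving from other LAN machines traverse PREROUTING:
iptables -t nat -A PREROUTING -i lan0 -p tcp --dport 443 \
  -j DNAT --to-destination 192.168.1.10:8443
# Packets the NAS sends to itself take the loopback/OUTPUT path and
# never hit PREROUTING, so this rule never applies to them.
```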


Wait, you update production systems without running a staging environment? Or even checking the update notes and your installed apps? Also no backups? What kind of business are you running over there?


If setting up TLS is too much work, better stay with a service. Signal is nice.


It really depends on whether you need transcoding or not. If not, it doesn’t matter. If yes, check for integrated GPUs on both models and verify that the iGPU will work as a transcoder for Jellyfin.
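
If the box does have an iGPU, a minimal compose sketch for handing it to a Jellyfin container for hardware transcoding could look like this (the image is the official one, /dev/dri is the usual Intel/AMD render node; verify for your hardware):

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    devices:
      - /dev/dri:/dev/dri   # pass the iGPU render node into the container
```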



The idea is that the router plugs in to your home internet and the server into the router. Between the two they get the server able to handle incoming requests so that you can host services on the box and address them from the broader Internet.

Why would I need a separate router for that? I’d need to configure the main router anyway.





The other bad news: there are so many vulnerabilities across all systems that can be used to gain root-level access, it’s just a matter of time. And even future vulnerabilities will be an issue, as the underlying Sinkclose attack will still work.



Ah. Personally I’d do the mounting via fstab to get a consistent path.
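
A minimal fstab sketch (the UUID and mountpoint are placeholders; get the real UUID from `blkid`):

```
# /etc/fstab — mount by UUID so the path stays stable across reboots
UUID=1234-abcd-5678  /mnt/data  ext4  defaults,nofail  0  2
```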



I am unsure about sustainability

In what regard?


Apart from the world of trouble you might get yourself into when doing such things on secured systems, why are you going at it in such a complicated way?

Why not simply use a self-hosted file/document storage and sharing solution like Nextcloud or Pydio Cells or something like that? Reachable over standard HTTP(S), which is a lot easier to expose than most other protocols.


I’m thinking 25 dB is a hard cap, ideally under 20 dB.

I think HDDs are typically around 5-10 dB,

Um, no. More like 20-25 dB at idle, up to 30 dB during heavy seek activity, depending on the model.

I run 3× 5400 rpm drives in my NAS, and the drives are by far the loudest parts of the whole build, clearly noticeable in the office room.


Define “very quiet”? Because that’s going to be tricky with spinning rust, depending on your noise tolerance.


Ideally he’s needing something great for video encoding, and Linux friendly to boot.

What’s the plan here, using the laptop for gaming and streaming, or only using the laptop as the streaming machine?


Not supported. Best case, it simply works in non-ECC mode; worst case, it won’t boot.


That depends on whether you trust them. Also, would this be your only backup?


There are some things that are easier to see and check in Portainer, but for pure compose handling (up, down, logs) dockge works really well.


I run a mixed setup, many of the “less important” containers are on watchtower auto-update, the rest on notification (reverse proxy, Nextcloud, etc).

But I also have many of them on specific branches instead of “latest”.
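
For example, pinning an image to a version tag instead of “latest” in compose (the tag here is illustrative):

```yaml
services:
  nextcloud:
    image: nextcloud:29-apache   # pinned major version, not :latest
```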


Any special requirement for Alpine, or just “because I want to”?


I’d be a bit concerned with having the git repo also be hosted on the machine itself.

Please tell me you have a tested backup solution/procedure in place.


Do you have a server, connection and domain available?

If yes, a simple Joomla setup with a single static page should work well.


Nice haul! I hope you also managed to get the small power plant for the drives. That’s not going to be pretty.


That’s most likely the syslog. Check the settings, you can choose the volume to use for it.


Which is one of those occasions where a dev sticks to the original feature list instead of trying to shoehorn in features that wouldn’t really fit.


While I do love Syncthing, it solves a different set of requirements.


Grafana + Prometheus + data gathering will at least give you the resource and usage stats.
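
A minimal prometheus.yml sketch scraping a node_exporter for host resource stats (the target host is an assumption; 9100 is node_exporter’s default port):

```yaml
scrape_configs:
  - job_name: node
    static_configs:
      - targets: ['nas.example.lan:9100']   # node_exporter default port
```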


Subtitles being “burned into” the frames instead of shipped as a separate track, sometimes also called hard-coded subtitles. This lets you use subtitles on devices that otherwise can’t display them, or that mess up the rendering. But it means the server has to re-encode each and every frame, which is a massive load on the server.
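
As an illustration of the cost, burning subtitles in with ffmpeg forces a full video re-encode (filenames are placeholders):

```shell
# Hard-code the first subtitle track into the video stream.
# -c:v libx264 re-encodes every frame; only the audio is copied.
ffmpeg -i movie.mkv -vf "subtitles=movie.mkv:si=0" \
  -c:v libx264 -c:a copy movie-hardsub.mkv
```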


Considering it’s basically just a script “frontend”: WireGuard and its documentation.


That setting also takes host names. As long as both containers share at least one network, put in the service name (not the container_name!), e.g. “npm” or whatever yours is called and you should be fine.
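
A compose sketch of that shared-network setup (service, image, and network names are made up):

```yaml
services:
  npm:                      # reachable from "app" as http://npm
    image: jc21/nginx-proxy-manager:latest
    networks: [proxy]
  app:
    image: myapp:latest     # hypothetical app image
    networks: [proxy]
networks:
  proxy: {}
```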


What are you using as a database? Also, from which Gitea version to which Forgejo version did you attempt the migration?


Those are two big points though, especially the latter. At least for people concerned with selfhosting their services.


It’s the way I do all of my service backups. One separate DB container per docker stack so nothing else is in there, all the data in one folder, and off we go at 1AM.
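
A sketch of that nightly dump for a Postgres-backed stack (container, DB user, and paths are assumptions):

```shell
#!/bin/sh
# Cron entry: 0 1 * * * /opt/backup/nextcloud-db.sh
set -eu
STAMP=$(date +%F)
# Dump the stack's dedicated DB container — nothing else lives in it:
docker exec nextcloud-db pg_dump -U nextcloud nextcloud \
  > /backup/nextcloud-db-"$STAMP".sql
```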


You’re building a machine from parts. Different mainboard, PSU, CPU and drive combinations. There’s no real way for a system manufacturer to optimize for a specific use case and power target.

A full ATX PSU alone has widely varying efficiency, and it often isn’t efficient at all down in the low double-digit wattage range.


You’re not going to see power levels similar to those ARM devices. A massive part will be the drives: 3.5" drives draw between 6 and 10 W each while spinning.
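
Quick back-of-the-envelope math for the drives alone (8 W is a mid-range assumption from the 6-10 W figure above):

```shell
# 3 spinning 3.5" drives at ~8 W each while running:
drives=3
watts_per_drive=8
echo "$((drives * watts_per_drive)) W for the disks alone"
```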