TBH have you tried just basic git? There's a web interface built into git itself, and you can use SSH for your repositories. It's simple and just works. If you need a faster web interface there's also cgit. There are no bells and whistles. Just configure SSH, drop your repos in /srv, and get to work.
If you need more than just standard basic git, then the other suggestions here are great, especially Forgejo!
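For anyone who hasn't tried it, the git-over-SSH setup really is just a couple of commands. A minimal sketch (paths and hostnames are placeholders; I'm using /tmp here so it can be run without root, but the idea is the same for /srv):

```shell
# On the server side, a "hosted" repo is just a bare repository in a directory.
# Using /tmp here for illustration; the parent comment uses /srv.
mkdir -p /tmp/srv/git
git init --bare /tmp/srv/git/myproject.git

# Over SSH the clone URL would be ssh://user@server/srv/git/myproject.git;
# locally the same clone looks like this:
git clone /tmp/srv/git/myproject.git /tmp/myproject
```

The web interface built into git itself is `git instaweb`, which serves gitweb for the current repository.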
I use Backblaze B2 for my storage, and I use restic to back everything up to it. It works well and I've had it going for YEARS at this point. For things I could never replace, like photos, I use external drives in addition to B2. Everyone knows that if something happens and we need to leave, just grab the drive that is stuck to the wall and the family photos will be safe.
My thought process goes like this: everything backs up to my home server, and I take snapshots of the data on a regular basis. So if I need to get something back, going to a snapshot is pretty simple. If my server(s) just stopped existing for some reason, I could pull it all back from B2. I've only had to actually restore from B2 a handful of times and it was worth it.
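If anyone wants to try the same setup, the restic-to-B2 side is only a few commands. A sketch with placeholder bucket names and credentials (use your own B2 application key):

```shell
# Credentials and repository location (placeholders)
export B2_ACCOUNT_ID="your-b2-key-id"
export B2_ACCOUNT_KEY="your-b2-application-key"
export RESTIC_REPOSITORY="b2:my-backup-bucket:home-server"
export RESTIC_PASSWORD="use-a-long-random-passphrase"

restic init                                   # one-time: create the encrypted repo
restic backup /home /etc                      # incremental, deduplicated backup
restic snapshots                              # list what you have
restic restore latest --target /tmp/restore   # pull data back when needed
```

Restic encrypts everything client-side, so B2 only ever sees ciphertext.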
XMPP is fantastic IMHO
If you want to support a great project and have great uptime check out conversations.im
I don’t recommend self hosting something you want available all the time. That being said everyone has different needs/uses 😊
I’m wasn’t implying that you shouldn’t host it yourself at all. Just maybe use a VPS for hosting it yourself.
Getting buy-in on the family & friends aspect comes down to being able to match or exceed the popular free services. If there's a perception that it's not reliable then it's highly unlikely they'll keep using it. So the last thing you want is something happening to your internet connection, NAS, etc. At the end of the day it's the pesky perception-equals-reality thing that dooms projects like this and tanks the spouse approval factor.
Self hosting XMPP works well for most internal things. IMHO communication software that you’re relying on shouldn’t be hosted at home.
Both of the servers you mentioned are great. I've used ejabberd in addition to them, and I think Prosody is better. Here's a link to a list of more servers.
Another option, since XMPP can do E2EE, is conversations.im. It's my go-to for XMPP hosting.
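For anyone leaning toward Prosody, getting a basic server up is mostly one small config file. A hedged sketch (the domain and cert paths are placeholders; check the docs for your Prosody version):

```lua
-- /etc/prosody/prosody.cfg.lua (fragment)
-- Modules for a modern mobile-friendly setup:
-- carbons/mam/smacks give message sync and history across devices
modules_enabled = {
    "roster"; "saslauth"; "tls"; "disco";
    "carbons"; "mam"; "smacks";
}

-- Replace example.com with your domain
VirtualHost "example.com"
    ssl = {
        certificate = "/etc/prosody/certs/example.com.crt";
        key = "/etc/prosody/certs/example.com.key";
    }
```

After that it's just creating users with `prosodyctl adduser you@example.com`.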
So correct me if I’m wrong.
You’re saying, in the case of git, that people using largely text based content over protocols nearly as old as the internet itself, just like the rest of the entire world, should not be allowed to do so?
Also what’s the RFC for secondary shitting streets? I must have missed that one.
Some things are, but the core functionality is easily extended through modules, and you can find a lot of sites with them. Some of the best modules are going to cost you some money, but it's worth it.
That being said Odoo is overkill if you just need to send an invoice. Odoo is perfect for running a business though. It can and will do EVERYTHING and then some.
I know people who run their entire businesses on it. Website, HR, inventory, time clocks, billing, etc. It scales really well too. The largest business I know of using it has 100s of employees. They even have a paid developer on staff who writes and maintains custom modules for them. The smallest business I know of using it has 5 employees, including the owners.
It’s seriously impressive software!
The synology stuff is neat but I personally wouldn’t use it. There’s a lot of stuff that is abstracted away from you and when you run into a problem it’s not easy to resolve. Plus you’re already running things that can do more.
If you want something like it, CasaOS would be worth a look. You just take a base install of Debian 12 and run their script on it. You'll get the ease of use that Synology has without it fighting you when you want to do something different.
Once you have that going, it's just as simple to get Nextcloud going, and anything else you want, which is just one click in the web UI. It can manage all the containers you have running on the Fedora VM too, so your reverse proxy, Blocky, etc. shouldn't be a problem to run on there.
Unless you REALLY want the Synology apps and stuff like that. If that's the case then go with Xpenology.
I use RHEL/Rocky 8 for all my home server stuff, mostly because I like my home server stuff boring and stable.
Since you were considering TrueNAS, maybe consider something like Debian/Ubuntu + CasaOS. That will give you a good base and webui to work with.
It’s not going to be a lean as it could be but it should give you enough guard rails and hand holding to get you started. Then you can figure out the rest of your needs from there. If you don’t like it you can always wipe it and try again with something else.
I’d stay away from the TrueNAS, Unraid, Proxmox, etc. mostly due to your hardware and that it’s your first home server.
They’re not bad at all, but a lot of the stuff is abstracted from you and since you’re more than likely going to want to tinker with it having a standardized base install with a distro that has a lot of documentation is going to be very helpful.
For sure!
What I’m talking about is perception of quality. If you know what you’re looking for then you’ll notice some of the artifacts. Especially in the darkest areas and when going from HDR to SDR.
It still looks better than streaming the same thing off of Netflix, Hulu, etc. So that's all I need/want.
I fully realize there’s compromise there and if I want to view it in all its original glory I can bust out the bluray.
Yes, you’re completely correct. There’s something to consider though.
CPU encoding gives the best results possible in terms of quality and size. Decoding, unless you have a very weak CPU, isn't necessarily the bottleneck in most transcoding applications, e.g. Plex, Jellyfin, etc.
So you can do things to make the media as streamable as possible, for instance encoding your media in AV1 using the mp4 container rather than mkv. If you make it web optimized, aka the moov atom up front, it makes playing the file much easier and less resource intensive. Now when a client that can't use AV1 requests it, your transcode can do SW decode and HW encode. Not as efficient as pure HW, but IMHO it's a worthwhile trade-off for the storage space you get in return.
You can make things more efficient by disabling subtitles and/or burn-in on the media server side. If you have people like myself who need subs on everything, then you can burn them in while you're encoding the media to AV1, or only use plain-text formats like UTF-8 SRT so you can pass them through, since m4v/mp4 doesn't support subs the way mkv does.
That’s essentially what the optimized versions do on Plex. Only it sticks with x264 rather than AV1.
If your media is only 720p then none of this would really make a difference for you. If you’re using 1080p+ rips then this will make a SIGNIFICANT difference. It’s made such a difference that I’ve started redoing my rips in 4K.
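As a concrete sketch of the encode described above (filenames are placeholders, and the SVT-AV1 preset/CRF are just a reasonable starting point, not gospel):

```shell
# Re-encode a rip to AV1 in an mp4 container, web optimized.
# -movflags +faststart relocates the moov atom to the front of the file
# so playback can begin before the whole file is downloaded.
ffmpeg -i input.mkv \
  -c:v libsvtav1 -preset 6 -crf 30 \
  -c:a copy \
  -movflags +faststart \
  output.mp4
```

Note that `-c:a copy` only works if the source audio codec is allowed in mp4; otherwise re-encode the audio to AAC or Opus.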
Unless that is you got a SAN in your closet and free electricity that is…
Yes it’s good, but with AV1 hanging about then you’re WAAAY better off using that over x265.
I re-encode all my stuff with AV1. It will take a 40GB x264 rip down to 3-4GB, whereas with x265 it will be around 10-15GB.
It’s a significant difference in storage size and (as far as I can tell) no obvious difference in quality.
It wasn’t meant to be taken literally. What I mean by that is if you’re the type of person who enjoys the upkeep of something as critical (though maybe not so much theses days) as email then go ahead and host your own password vault service. I’m not saying it shouldn’t be done and couldn’t be done.
My point is that there’s going to be times where you NEED your password vault and having it be down because something happened at home or your VPS had a problem is a really shitty situation to be in.
Of course there’s work arounds and edge cases to everything too. For me planning and building for those possibilities came down to what can I do that is the most reliable, simple, and boring. Because that’s what most people need with anything that is critical.
IMHO much like backup, password storage should be reliable, simple, and boring. Kinda like flushing a toilet or flipping a light switch.
Having gone through all of these options I have thoughts.
Option 1 sounds awesome but will almost always leave you in a situation where you can’t get your logins when you need them in an emergency. You’re always depending on a chain of things. Depending on your situation it may not be a big deal. But this option sucks, imho.
Option 3 sounds amazing because it gives you the control of option 1 with the ease of option 2. But… unless you’re the kind of person that enjoys hosting their own email server you really don’t want this option. Fun in theory but not so much when you realize you now have a 3rd job.
So that leaves option 2. It's great, but you're depending on someone else. This is the option that most people should choose too, imo. However, it lacks some of the control and trust that options 1 and 3 have.
Sooooo, that leaves us with option 4, the onion option. Breaking up your data into layers and using different tools for them.
So first and foremost I want my password storage to always be available. For me that means Bitwarden (though I'm evaluating Proton Pass currently). This is the outer layer. Things that can and should be stored here are stored here. I use it to manage web logins and 2FA tokens for those sites. I also use it for storing autofill data, e.g. credit cards. I don't use it to hold things like my GPG keys.
The next layer is pass. This layer is mostly for things where I need logins or other information on headless/remote servers. Think self-hosted lab services like MariaDB/Postgres, or backups. It's easily kept in sync with git. This is the layer where I'll store things like GPG keys and other VERY sensitive data that I need to sync around.
For other things on this layer I use Ansible Vault. This is mostly used for anything where I need automation and/or I don't want to or can't easily use my YubiKey for GPG. It's kept in sync with git as well.
Lastly, the inner layer is AGE or PGP. This is for anything else I can't use the above for, so my Bitwarden exports/backups are at this level too. I also use this layer for things that I need to bootstrap a system. Think sensitive dotfiles. This can be kept in sync with git as well.
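The inner layer is only a couple of commands. A sketch using age (filenames are placeholders; PGP works the same way with `gpg -e -r`):

```shell
# Generate a keypair once; keep age.key itself inside the pass/gpg layer
age-keygen -o age.key

# Encrypt a Bitwarden export (or sensitive dotfiles) to your public key;
# age-keygen -y prints the public recipient from the key file
age -r "$(age-keygen -y age.key)" -o bitwarden-export.json.age bitwarden-export.json

# Decrypt when bootstrapping a new machine
age -d -i age.key -o bitwarden-export.json bitwarden-export.json.age
```

The resulting `.age` files are just opaque blobs, so they're safe to throw in a git repo or on Dropbox.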
Git is the best sync solution, imo, because you can store the repo anywhere and use anything to sync it. Just throw the raw repo on Dropbox, use SSH with it on a VPS, rsync it, etc. You'll always have it somewhere and on something.
My workflow goes like this: Bitwarden -> Apple/Google/Firefox -> Pass -> Ansible -> AGE/PGP
This allows for syncing things as needed and how needed. It also gives you the option of having an encrypted text file if/when everything fails.
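For the pass layer specifically, the git sync is built in. A sketch (the GPG key ID and remote URL are placeholders):

```shell
# Initialize the store against your GPG key, then put it under git
pass init "0xYOURKEYID"
pass git init
pass git remote add origin ssh://user@vps/srv/git/password-store.git

# Every insert/edit is auto-committed once git is initialized,
# so syncing is just an ordinary push/pull
pass insert lab/mariadb/root
pass git push -u origin master
```

`pass git <anything>` just passes the arguments through to git inside the store directory.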
Neat idea, would be interesting if it used your own content from DLNA, Jellyfin, etc. The code looks simple enough that it should be possible to add a plug-in/provider for that stuff.
I think that it’s not going to have a long lifespan though. Being a simple and easy front end for various streaming sites might paint a target on it. Rightly or Wrongly.
I can see it having a longer life if it were to integrate some other technologies aside from the above like IPFS, BitTorrent, etc. The libraries to do that are already readily accessible eg LibP2P. Though that again might paint a bigger target.
Either way it’s definitely really neat and I’m sure a fun project to fork and explore if someone is feeling up to that kind of work.
4K on my P2000 or using Intel QSV isn't a great experience. I can totally see it not being a good experience on a P4000 too.
That being said, with HDR at 1080p it works with both QSV and the P2000, so it should work like a champ on the P4000. I don't really have any HDR displays, so I don't grab that many things in HDR; YMMV.
The best advice I can offer is: if the content is transcoded into an mp4 container with the moov atom up front (aka fast start / web optimized) and you're not using subtitles, it will work okay-ish as long as you do not pause it. Using the mkv container is just asking for sadness, in my experience. Though at this point, if I need to do that I just transcode into AV1, burn the subs into it, and pass through the audio.
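The "burn the subs in and pass through the audio" step looks roughly like this (filenames are placeholders; this assumes the subs are the first subtitle stream in the mkv and ffmpeg was built with libass):

```shell
# Burn the embedded subtitles into the video while encoding to AV1,
# copy the audio untouched, and web-optimize the mp4
ffmpeg -i input.mkv \
  -vf "subtitles=input.mkv" \
  -c:v libsvtav1 -preset 6 -crf 30 \
  -c:a copy \
  -movflags +faststart \
  output.mp4
```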
Mostly it comes down to data types, disk space, and restores. Even if you're doing incremental backups with tar, it isn't as fast, space efficient, or easy to restore (in most cases) as something like restic, borg, etc.
I have found that when you just need something simple that works everywhere, it's hard to beat tar!
Restic. It has native S3 compatibility, and when you combine it with something like B2 it makes amazing offsite storage, so you can enjoy the tried and true 3-2-1 backup strategy.
Also, Fedora Magazine did a few posts on setting it up with systemd that make it SUPER EASY to get going if you need a guide.
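In the same spirit as those guides, the systemd side is just a oneshot service plus a timer. A sketch (paths, backup set, and the env file holding the B2/restic variables are placeholders):

```ini
# /etc/systemd/system/restic-backup.service
[Unit]
Description=restic backup to B2

[Service]
Type=oneshot
# b2.env holds B2_ACCOUNT_ID, B2_ACCOUNT_KEY, RESTIC_REPOSITORY, RESTIC_PASSWORD
EnvironmentFile=/etc/restic/b2.env
ExecStart=/usr/bin/restic backup /home /etc
ExecStartPost=/usr/bin/restic forget --keep-daily 7 --keep-weekly 4 --prune

# /etc/systemd/system/restic-backup.timer
[Unit]
Description=Run restic backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now restic-backup.timer`.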
I have an Ansible role that configures it on everyone's laptops so that they have a local backup location (the NAS) and a remote one (B2).
Works like a charm for the past 8+ years.