It really depends on the project. Some take breaking changes seriously, avoid them, and auto-migrate when they can’t; others throw them out in “minor” releases, and while there may be a lot of breaking changes on paper, you only occasionally run into one that actually affects you. I typically don’t want containers that are going to be a lot of work to keep up to date, so I jettison projects with unreliable releases, and when one ships a breaking change it’s a good time to re-evaluate whether I want that container at all and look at alternatives.
So no, it’s not safe, but depending on the project it actually can be.
Ideally you want a router that runs an open-source firmware (OpenWrt, DD-WRT, OPNsense, FreshTomato). It’s better because you get a completely unlocked system with everything you need, plus security patches for the hardware’s true lifetime. Every router company stops shipping security updates after a few years, and at some point the device ends up part of a botnet or one of these mass-hack events. It’s best not to play that game and instead run open-source firmware from the outset.
The best way to start is to look at openwrt.org and use the hardware table’s filtering to find a device that supports your needs (at least 5 LAN Ethernet ports, I guess, and some Wi-Fi, though AC sounds like it will do). The other option is a more typical 4-LAN-port router, which gives you a lot more device choice, with a switch added to that; it doesn’t sound like you care much about it being managed or doing >1 Gbps, so those are dirt cheap.
I don’t think modern Raspberry Pis make much sense unless you are using the GPIOs or really need the low power consumption. The 3 and the 4 were OK price-wise, but the Pi 5 is priced quite close to all these N100 mini computers, which offer a lot more performance and expansion than a Raspberry Pi 5 while still staying quite low power.
Either a Topton (or similar) N100-based machine or a second-hand mini PC is the way to go at the ~$100 mark. The mini PC will be faster, probably more expandable, and cheaper, but it will also use more power.
It has been slowly improving. It used to be a lot worse, and I have far fewer issues with it now than I did before all the changes. It’s not the fastest or best way to do anything; there are better calendar, file-sync, email, etc. applications out there in every category that run better. But it is quite an easy way to make a lot of things happen.
The new Linuxserver.io Docker image has, at the very least, solved Nextcloud’s annoying update cycle and the need to go through it every few months. I haven’t ever had it die, but I don’t push it hard and I keep the plugins to a minimum, because I just don’t trust it and it doesn’t run all that well.
I do both. I have a custom-built NAS based on a Ryzen 3600 with ZFS across 4 drives; it runs about 20 self-hosted applications and stores the majority of my files, but it’s only accessible from within the home. I also rent a small VPS for personal web space and for the self-hosted apps I want out of the house.
In the past I have also rented dedicated servers from Hetzner, or bigger VPSes from Amazon, to host a game server. Alongside those I often ran community applications like a website, forums, wikis, and custom chat and voice-comms services.
It’s all self-hosting to me since I run it. The various options are all about trade-offs between security, accessibility, cost, and performance. When you add it up, the cheaper cloud options can beat buying and running your own hardware once you take into account electricity costs and the likely need to replace hardware within 5 years. The big cloud providers aren’t price-competitive, but Contabo/Hetzner surprisingly are, especially if you pay a lot for electricity. Then again, if you need a game server it can be quite hard to find good fast CPUs in the cloud, and running one 24/7 for a community gets expensive, so the trade-off flips back to having your own.
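As a rough illustration of that math, here’s the kind of back-of-envelope comparison I mean. Every number in it is a placeholder assumption (hardware price, wattage, tariff, VPS rate), not a real quote:

```python
# Rough 5-year cost comparison: home server vs a small VPS.
# All figures below are assumed placeholders, not real prices.
HARDWARE_COST = 400.0    # one-off build cost, replaced within ~5 years
AVG_WATTS = 40.0         # average power draw of the home box
PRICE_PER_KWH = 0.35     # electricity tariff; this swings the result most
VPS_PER_MONTH = 6.0      # small Contabo/Hetzner-class VPS

MONTHS = 60  # 5-year horizon, matching the hardware replacement cycle

kwh_per_month = AVG_WATTS / 1000 * 24 * 365 / 12
electricity_per_month = kwh_per_month * PRICE_PER_KWH

home_total = HARDWARE_COST + electricity_per_month * MONTHS
vps_total = VPS_PER_MONTH * MONTHS

print(f"home server: {home_total:.0f} over 5 years "
      f"(electricity {electricity_per_month:.2f}/month)")
print(f"small VPS:   {vps_total:.0f} over 5 years")
```

At a high tariff the VPS wins comfortably; with cheap electricity, or once you need lots of local storage, the home box pulls back ahead.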
Since I got 1 Gbit/s fibre internet, my need for an internal NAS has definitely reduced; the internet is nearly as fast as the local network, so I could now meet my NAS needs remotely.
Even the main search engines don’t index the entire internet these days, and their databases are already truly massive. Writing a basic web crawler that produces a search index isn’t all that hard (I used to set it as a programming exercise for applicants), but dealing with the volume of data of the entire internet and storing it to produce a worthwhile search engine is just not feasible on home hardware; it would be TBs at the very least. The result wouldn’t just be a little worse, it would be dramatically worse unless you put substantial resources into it, including enormous amounts of network bandwidth that would have your ISP questioning your “unlimited 1 Gbps fibre” contract. It would probably take years to get decent and would always be many months out of date at best.
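To show what I mean by “basic”, here’s a minimal sketch of that kind of toy crawler-plus-index exercise. It’s Python with requests/BeautifulSoup by my own choice, and the function name, seed URL, and page limit are all illustrative, not from any real spec:

```python
import re
from collections import defaultdict, deque
from urllib.parse import urljoin, urldefrag

import requests
from bs4 import BeautifulSoup

def crawl_and_index(seed_url, max_pages=50):
    """Breadth-first crawl from seed_url, building an inverted index
    mapping each word to the set of pages whose text contains it."""
    index = defaultdict(set)   # word -> {urls}
    seen = {seed_url}          # URLs already queued, to avoid loops
    queue = deque([seed_url])
    fetched = 0

    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue  # dead link, timeout, etc.
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        fetched += 1

        soup = BeautifulSoup(resp.text, "html.parser")

        # Index the page's visible text.
        for word in re.findall(r"[a-z0-9]+", soup.get_text().lower()):
            index[word].add(url)

        # Queue outgoing links, stripping #fragments and duplicates.
        for a in soup.find_all("a", href=True):
            link = urldefrag(urljoin(url, a["href"])).url
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)

    return index

# index = crawl_and_index("https://example.com")
# index["keyword"] -> set of URLs whose text contains "keyword"
```

The toy version fits in 40 lines; the hard part is everything after it: storage, ranking, politeness, dedup, and re-crawling at internet scale.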
It doesn’t seem practical to self-host something that depends on downloading and indexing every single page of the internet; it’s a truly massive-scale problem.
AMD’s graphics division unfortunately has a long history of abandoning products sooner than is reasonable. It’s not really acceptable. Up until earlier this year my NAS/server was running a 3600, and I only changed it to save power; it’s still a very workable CPU in that role.