
A CLI doesn't make much sense to me either when the *arr suite already has a well-documented REST API.
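For instance, here's a minimal sketch of hitting that API directly, assuming a local Radarr instance on its default port; the URL and key are placeholders (grab a real key from Settings > General):

```python
# Minimal sketch: querying a *arr REST API (Radarr here) instead of a CLI.
import requests

BASE_URL = "http://localhost:7878/api/v3"  # assumed local Radarr instance
API_KEY = "your-api-key-here"              # placeholder

# List every movie Radarr is tracking and whether a file exists for it.
resp = requests.get(f"{BASE_URL}/movie", headers={"X-Api-Key": API_KEY})
resp.raise_for_status()
for movie in resp.json():
    print(movie["title"], "downloaded" if movie.get("hasFile") else "missing")
```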


I hate writing and reading XML compared to JSON, and I don't really care that one is slightly leaner than the other. If your concern is size or speed, you should probably be rethinking how you serialize the data anyway (Protobuf/DB).
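To illustrate (a rough sketch with a made-up record, not a benchmark): the size gap between XML and JSON is marginal next to what a schema'd binary format saves you.

```python
# Rough size comparison for one record; illustrative only, not a benchmark.
import json

record = {"id": 42, "title": "Example", "tags": ["a", "b"]}

as_json = json.dumps(record, separators=(",", ":"))
as_xml = (
    "<record><id>42</id><title>Example</title>"
    "<tags><tag>a</tag><tag>b</tag></tags></record>"
)

# XML carries more markup overhead, but if either number actually matters,
# a binary format (e.g. Protobuf) or a database is the real answer.
print(len(as_xml), len(as_json))
```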


Thanks for the breakdown! This is probably the most helpful write-up I've seen of a build like this.


Yeah I do; you brought up that local isn't always an option.

I desperately want it to work for me, I just can't get it to work without spending thousands of dollars on hardware just to get back to the same experience as having a regular desktop at my desk.


What is the cost of the thin clients, and are you doing this over copper?

Are your desks multi-monitor? To get the bare minimum in my household's scenario, I would need at least 12 streams at greater than 1080p.

For 5 seats, how much did it cost versus just having a computer in each location? For example, looking at HDBaseT to replace just my desk setup, I would need four ~$350 devices, just looking at Monoprice for an idea (https://www.monoprice.com/product?p_id=21669), and that doesn't even cover all of the screens in my office.
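Back-of-the-envelope on that (device count and price taken from the Monoprice listing above; everything else assumed):

```python
# Rough per-desk cost of HDBaseT extenders vs. just buying a desktop.
extenders_per_desk = 4    # from the listing above; still not all my screens
extender_price = 350      # approximate Monoprice price

extender_cost = extenders_per_desk * extender_price
# $1,400 per desk before the rack hardware or cabling is even counted --
# already in the price range of a decent standalone desktop.
print(f"extenders alone: ${extender_cost}")
```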


Right, but who has the resources to rent compute with multiple GPUs? This is a gaming setup, not office work, and the OP was talking about racking it.

All of those services offer an inferior experience to being at the hardware; it's just not the same. Seriously, try it with multiple 1440p 144 Hz displays, it just doesn't work out well: you are getting a compromised product for a higher cost. You need a good GPU (or at least a way to decode multiple HEVC streams) in the client, so you can't just run a standard thin client.

'Low latency' means a near-native experience. I'm talking: you sit down at your desk and it feels like you are at your computer (that is, multiple monitors, HDR, USB swapping, Bluetooth, audio, etc., all working seamlessly without noticeably diminished quality). Anything less isn't worth it, since you can just use your computer like normal.


A DisplayPort-to-fiber extender is $2,000. The fiber is not for the network.

Moonlight does not do what I want; Moonlight requires a GPU on the thin client to decode. You would need a high-end GPU to decode multiple high-resolution video streams. Also, AFAIK, Moonlight doesn't support multiple displays.


Can this solution deliver 3+ streams of high-resolution (1440p or higher at 144 fps), low-latency video with no artifacting and near-native performance and responsiveness?

Gaming demands high-fidelity, low-latency I/O; no one wants to spend all this money on racks and thin clients only to then get laggy windows and scrolling, artifacts, video compression, and low resolution.

That's the problem at hand with a gaming server: if you want to replace a gaming desktop with a VM in a rack, you need to actually get the I/O to the user somehow, either through dedicated cables from the rack, fiber extenders, or networking. The first is impractical: it involves potentially 100 ft runs of multiple DisplayPort, HDMI, USB, etc. cables, and is very rigid in its application. The second is very expensive, shooting the price up to thousands of dollars per seat for DisplayPort/USB-over-fiber extenders. And as for the third, I have yet to see a VNC/remote solution that can deliver near-native video performance.
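The raw numbers show why the networking option is so hard (back-of-the-envelope, assuming uncompressed 24-bit color):

```python
# Raw bandwidth of a single uncompressed 1440p 144 Hz display stream.
width, height, fps, bits_per_pixel = 2560, 1440, 144, 24

gbps = width * height * fps * bits_per_pixel / 1e9
# ~12.7 Gbit/s -- one display already saturates a 10 GbE link, so a
# multi-monitor setup forces either heavy compression (artifacts, latency)
# or expensive fiber extenders.
print(f"{gbps:.1f} Gbit/s per display")
```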

I should reiterate: the OP wants to do fidelity-sensitive tasks like video editing; they don't just need to work on a spreadsheet.


None of the presented solutions cover being in a different place than the rack. The same network is fine, but at a minimum a different room.

How do you deliver high resolution (e.g. 1440p, 144 fps) to multiple monitors with low latency over a network? I haven’t seen anything like that accomplished without running fiber from the host.

Eventually, your thin client will need too much power anyway, making the costs rise a lot. It makes sense in an office where you have 500 seats and can load-balance resources.

If someone can show me a multi-seat gaming server that has native remote performance (as in, you drag windows around at 144 fps, not the standard artifacty, high-latency behavior of VNC), I'll eat a shoe.


I'm inclined to agree, but it's really just a semantic difference. If they really wanted to, they could just release a new major version every year, tie the license to that version, and still get an effective annual subscription.


Yes, but that doesn't mean that OpenSubtitles can't take their newfound savings and use them to weed out poor-quality submissions.

Wikipedia editors do it for free and yet they still maintain a high standard of quality.


They often do not include many languages. Sure, if you want English subtitles, it's likely they will be there, but good luck getting subtitles for movies and shows that didn't have an official release in a given country.



So frustrating when a subtitle ad pops up, spoiling that a movie is ending.

Something dramatic is happening on screen, and then "you can advertise on opensubtitles" appears at the bottom, letting you know that there are 10 seconds left and spoiling any tension or drama in the scene.


I can download ten subtitles just trying to find one good one for a single TV show episode.

IMO, if they want to be so strict about downloading subtitles, they should raise the quality standards for the subtitles that are submitted.



I paid for a seedbox for one month, around $25, and built enough ratio from that month to last me years.



Perpetual audits of free software are unsustainable, in my opinion.

To truly audit every piece of software, you need an independent party to spend time (often more than the development itself took) looking through the code. That person needs to be at least as experienced as the developers of the software, with specific knowledge of vulnerabilities and malicious techniques.

They then need to audit and monitor all of the channels of distribution for that software, including various websites and repositories. This needs to be done constantly.

You effectively need to at least double the total level of effort for all software.

Yes, high-profile software (sometimes) gets audited regularly, but the assumption that anything you grab from your package manager has been truly audited leads to a false sense of security. Likewise, assuming that a performed audit means there are no issues with the code also leads to problems.

The reality is that most open source software doesn’t get audited because it is too much work.


Not sure about your budget, but I switched to a UDM SE and it's pretty awesome; for me the benefit comes in with cameras and access control. The UI and off-the-shelf tooling are very nice with it.

OPNsense is another, more DIY option.

I used an EdgeRouter 4 before the UDM for a few years, and it was pretty OK.


Them being 'supposed to be everywhere' doesn't change the fact that they litter the sidewalk and use the public areas of my town as a pseudo-frontage for their business.

I have no problem with the bike systems that have docks for the bikes; that centralizes the locations and keeps the bikes organized.

It's not ignorance, it's a full understanding that they pollute the public areas and already-limited walkways in my city.


My uneducated guess would be that overall internet speeds have increased enough that streaming is feasible for most people, and that there is a lot of healthy competition in music streaming, where almost all of the major competitors have nearly identical libraries, so you don't need to pay for 10 different subscriptions for music.

Additionally, services like Tidal offer lossless streaming, so even the hardcore people can get all (well, most) of their music legitimately.

Basically, the music industry did the opposite of the movie/TV industry and generally figured out streaming a long time ago.