poop

  • 1 Post
  • 198 Comments
Joined 1Y ago
Cake day: Jun 11, 2023


These are the people who complain to their ISP when their game ‘lags’ on their wirelessly connected computer several rooms away from the router.


Can you run something like iperf3 or OpenSpeedTest between the server and client to prove it’s a network throughput issue?

Do you have a network switch you can add, to avoid switching through your router (if it is indeed the bad link)?

Have you ensured you aren’t unknowingly using Wi-Fi at either end?


NGINX is a bit more hands-on than some other options, but it’s mature, configurable, and there’s a huge amount of information out there for setting it up for various use cases.

In my case, it’s what I set up when I was first getting into this and it works, so I don’t want to go through setting up anything else.


Thanks for the insightful and helpful comment.


Unraid is great and I have been using it for over a decade now, but a paid OS on a two-bay NAS seems excessive.


I can’t say I care as much as I used to, since encoding has gotten quite good. But I have also gotten better at seeing (a.k.a. worse at being distracted by) compression artifacts, so while I am less of a perfect-remux-rip supremacist, I’m also more sensitive to bad encodes, so it’s a double-edged sword.

I still seek out the highest quality versions of things that I personally care about, but I don’t seek those out for absolutely everything like I used to. I recently saved 12TB running a slight compression pass on my non-4K movie library, turning (for example) a 30GB 1080p Bluray remux into a 20GB high-bitrate H265 encode. That made more room for more full-fat 4K Bluray files for things I care about, and for the few 1080p full remuxes I want to keep: rarities, releases that aren’t as good in 4K, or ones where the 4K release was drastically different (like the LOTR 4Ks having poor dynamic range, or the colours being changed for the Matrix), which I may encode in the future to save more space again. I know I can compress an 80GB UHD Bluray file down to 60GB with zero noticeable loss, and that’s as far as I need to go. I don’t need to go down to 10 gigs like some release groups try to do; at that level of compression you might as well be at 1080p.
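As a rough sanity check on the 12TB figure (the total remux library size here is an illustrative assumption, not from the original comment): shrinking 30GB files to 20GB reclaims a third, so roughly 36TB of 1080p remuxes gives back about 12TB.

```python
# Rough check on the "saved 12TB" claim: re-encoding 30GB remuxes down to
# 20GB saves one third, so ~36TB of 1080p remuxes yields ~12TB back.
# The 36TB library size is an assumed figure for illustration.

def space_saved_tb(library_tb: float, before_gb: float, after_gb: float) -> float:
    """TB reclaimed by re-encoding a library at the given per-file size ratio."""
    return library_tb * (1 - after_gb / before_gb)

print(round(space_saved_tb(36, 30, 20)))  # -> 12 (TB reclaimed)
```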

I can’t go as low as a low-bitrate 720p movie these days, as I’m very close to a large screen so they tend to look quite poor: soft edges, banded gradients, motion artifacts, poor sound, etc. But if I were on a smaller screen or watching movies on a phone like I used to, I probably wouldn’t care as much.

Another side to my choice to compress is that I have about 10 active Plex clients at the moment. Previously they were mostly getting transcoded feeds (mostly from remux sources), but now most of them are getting a better quality encode (slow CPU encode vs fast GPU stream) direct to their screens. So while I’ve compressed a decent chunk of the library, my clients are getting better quality feeds from it.


I use Plexamp for that; Jellyfin does it too. You can assign libraries per user quite easily.

So for 3 users you might have 4 libraries: one per user, plus a shared library they all have access to.


I have complete ROM sets for a couple of platforms in my archive; they’re available on SLSK, but I don’t have a huge amount of bandwidth available.

Sad to see the old giants like Vimm’s finally being attacked after all these years.


The 2.5" disks are now mostly direct USB-controller disks internally, rather than SATA drives behind adapters.

3.5" disks are still SATA as far as I’ve seen, but the actual SKUs of the disks are often the lower grades. You will get a disk that looks like another good disk but with, for example, only 64MB of DRAM instead of the 256MB on the one you would buy as a bare internal drive, so they can end up a bit slower. And warranties are usually void.


Used to be my main source of disks, but these days there are better ways and it is easier to know exactly what you are getting.


Are you transcoding?

4mbit per client for 1080p is generally a workable minimum for the average casual watcher if you have H265-compatible clients (and a decent encoder, like a modern Intel CPU for example); 6-8mbit per client if it’s H264 only.

Remember that the bitrate-to-quality curve for live transcoding isn’t as good as a slow, non-real-time encode done the brute-force way on a CPU. So if you have a few videos that look great at 4mbit, don’t assume your own transcodes will look quite that nice: you’re using a GPU to get it done as quickly as possible with acceptable quality, not as slowly and carefully as possible for the best compression.
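A quick budget calculation using the per-client floors above (the H264 figure is taken as the midpoint of the 6-8mbit range):

```python
# Back-of-envelope bandwidth budget for simultaneous 1080p transcodes,
# using the per-client minimums mentioned above.

PER_CLIENT_MBIT = {"h265": 4, "h264": 7}  # h264 = midpoint of the 6-8 range

def transcode_budget_mbit(clients: int, codec: str) -> int:
    """Total sustained bandwidth needed for `clients` simultaneous 1080p streams."""
    return clients * PER_CLIENT_MBIT[codec.lower()]

print(transcode_budget_mbit(5, "h265"))  # -> 20 (Mbit/s for 5 H265 clients)
print(transcode_budget_mbit(5, "h264"))  # -> 35 (Mbit/s for 5 H264 clients)
```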


You’re confusing a container format (MKV) with a video codec (AV1).

MKV is just a container, like a folder or zip file, that holds the video stream (or streams; technically you can have multiple), which could be in H264, H265, AV1, etc., along with audio streams, subtitles, and other attachments such as custom fonts and posters.
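The container/codec split is easy to see at the byte level: every Matroska file starts with the same EBML header magic regardless of which codecs are inside. A minimal sketch (the fake file here is just the magic bytes, purely for illustration):

```python
# MKV (Matroska) is identified by its EBML header magic, not by the codec
# of the streams inside it; this sniffs the container, nothing more.
import os
import tempfile

EBML_MAGIC = b"\x1a\x45\xdf\xa3"  # first four bytes of every Matroska/WebM file

def looks_like_mkv(path: str) -> bool:
    """True if the file starts with the Matroska/EBML container magic."""
    with open(path, "rb") as f:
        return f.read(4) == EBML_MAGIC

# Fake minimal "MKV" containing only the magic bytes, for demonstration.
with tempfile.NamedTemporaryFile(suffix=".mkv", delete=False) as f:
    f.write(EBML_MAGIC + b"\x00" * 16)
print(looks_like_mkv(f.name))  # -> True, whatever codec the streams would use
os.remove(f.name)
```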

As for the codec itself, AV1 done properly is a very good codec, but to be visually lossless it isn’t significantly better than a good H265 encode unless you do painfully slow CPU encodes rather than fast, efficient GPU encodes. People who are compressing their entire libraries to AV1 are sacrificing a small amount of quality, and some people are more sensitive to its flaws than others; in my case I try to avoid re-encoding in general. AV1 is also less supported on TVs and media players, so you run into issues with some devices not playing the files at all, or having to fall back to CPU decoding.

So I still have my media in mostly untouched original formats. Some of my old movie archives and non-critical things like daily shows are H265 encoded for a bit of space saving without risking compatibility issues, but most of my important media and movies are not re-encoded at all; if I rip a Bluray I store the video stream that was on the disc untouched.


N5095? Lots of reports of that one not supporting everything it should based on other Jasper Lake chips, the CPU getting hit for decode when it shouldn’t, for example. Also, HDR-to-SDR can’t be accelerated with VPP on that one as far as I know, so the CPU gets smashed. I think you can do it with OpenCL though.


Was it an N100? They have a severely limited power budget of 6W compared to the N95 at around 15W.

I’m running Jellyfin on top of Ubuntu desktop while also playing retro games. That all sits in a Proxmox VM with other services running alongside it. It’s perfectly snappy.


One of my mini PCs is just a little N95 and it can easily transcode 4K HDR to 1080p (HDR or tonemapped SDR) for a couple of clients, with excellent image quality. You could build a nice little server with a modern i3 and 16GB of RAM and it would smash through 4 or 5 high-bitrate 4K HDR transcodes just fine.

Is that one transcoding client local to you, or are you trying to stream over the web? If it’s local, perhaps put some of the budget toward a new player for that screen.


I’ve had good luck with the WD Blue NVMe (SN550).

I’ve put several of those into machines at work and have had years without an issue. I’m also running a WD Blue SN550 1TB in my server as one of the caches: 25,000 hours power-on time, >100TB written, temperatures way higher than they should be, and still over 93% health remaining according to SMART.


Sonarr/Radarr etc. make it very easy and safe for media, but apps and games would be more of a serious sit-down-and-talk kind of situation, as more can go wrong there.


Soulseek is more like an old-school peer-to-peer network, like Kazaa, LimeWire, WinMX, eD2k, etc.

I haven’t seen any clients with a playlist downloader, though that sounds like a cool feature to suggest.

You don’t have to seed.


Yeah, I’ve got 3 in my video distribution rig at the moment: a 2015, a 2017 and a 2019, and they are all going strong, all on Projectivy with some ADB tweaks though.



You should consider upgrading to some kind of mesh system then. Sure, they aren’t perfect, but even a basic 3-node kit could probably increase your throughput tenfold. If you want to use DD-WRT or OPNsense or whatever, you can still run it separately and route internet traffic through it or use it for your DHCP server.

To stream 4K Bluray remux rips on your LAN you need a solid 150mbit minimum between server and player to be reliable, for example. I am hardwired all the way except for mobiles, but even on Wi-Fi I can easily pull 400-500mbit real-world throughput through most of the house thanks to my Wi-Fi 6 setup with multiple APs.
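Where the ~150mbit figure comes from: a big UHD remux averages around 75 Mbit/s, and bitrate peaks can run to roughly double the average, so you want that much sustained headroom. The 80GB size, 140-minute runtime and 2x peak factor below are illustrative assumptions:

```python
# Sustained LAN throughput needed to stream a remux without buffering:
# average bitrate (size / runtime) times a headroom factor for bitrate peaks.
# 80GB / 140min / 2x peaks are assumed example numbers.

def needed_lan_mbit(size_gb: float, runtime_min: float, peak_factor: float = 2.0) -> float:
    """Mbit/s of sustained throughput needed, allowing peak_factor over the average."""
    avg_mbit = size_gb * 1000**3 * 8 / (runtime_min * 60) / 1e6
    return avg_mbit * peak_factor

print(round(needed_lan_mbit(80, 140)))  # avg ~76 Mbit/s -> ~152 with peak headroom
```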


The Shield Pro 2019 is probably still the best overall. It’s not perfect, as there are some weaknesses due to the age of its chipset, but for all the common formats used in movies and TV it works perfectly, especially if you are playing full remux files rather than re-encoded, compressed video. Kodi runs very well, Plex runs very well, and Jellyfin is mostly perfect too, though it has some limitations in the current version.

Yes, it supports HDR10 (not 10+) and Dolby Vision, which covers 98% of all 4K Blurays and TV shows. Anything HDR10+ just gets played in HDR10 compatibility mode, and if your TV doesn’t do DV it plays the HDR10 layer on 99% of files. There are some issues with HLG as it isn’t properly supported, but you don’t come across that format all that often and there is usually an SDR or regular HDR version available; if your TV supports manually activating HLG then it works fine.

Yes there is a minor colour bug in some DV content, no it isn’t the end of the world as some people make it out to be.

It is one of the only players that will give you full DTS:X and Dolby Atmos support, and it has a very nice configurable upscaler for lower-res content (AI upscale on low works excellently with minimal artifacts). It still has a lot going for it despite its age.

Also, it’s easy to decrapify with ADB; you can easily configure third-party launchers and other fun stuff.


What is your network infrastructure that is giving you those poor performance numbers?

Most consumer all-in-one routers are crap, but not that bad. The file server should always connect to the main hub of the network with Ethernet (whether that be the router, a switch or an all-in-one crap box); these days pretty much everything should be at least 1 gigabit.

Are you trying to use Wi-Fi for everything? That’s a recipe for disaster unless you really know what you are doing and have multiple APs and careful signal strength and channel management.


Prowlarr is good because it combines usenet indexers and torrents. Makes it very easy to search for anything and compare versions/sources.


Yip, I have a Linux VM running on one of my boxes in the garage that is plugged into a video matrix so I can bring it up on any screen in the house. I use the Pi to connect keyboard/mouse/controllers etc. to it when I’m using it.


I use Ubooquity and Komga, both mainly for the OPDS service which I access on various devices.

Ubooquity is good for basic book and file serving, but does support graphic content. Komga is very much graphics-focussed and is very good at it.


I replaced 4x Pi 4 4GB with a single N95 mini PC with 16GB RAM and won’t look back.

The only Pi left in my home is just running a 24/7 USBIP bridge.

The only reason to use a Pi is if you need GPIO pins for custom devices.


In most cases yes, but HDD space is cheap enough that lossless compression is just the best option. You can always use them as originals to spin off MP3s or other compressed files when needed.

300 CDs would only be around 120 gigs FLAC compressed.
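That estimate works out from a typical FLAC rip landing around 400MB per disc (FLAC usually compresses CD audio to roughly 55-60% of the raw PCM; the exact figure varies with genre and compression level):

```python
# Ballpark for "300 CDs in about 120GB of FLAC": FLAC typically shrinks
# CD audio to roughly 400MB per disc. That average is an assumption;
# real rips vary with album length and how compressible the music is.

AVG_FLAC_PER_CD_GB = 0.4

def flac_library_gb(num_cds: int) -> float:
    """Estimated FLAC library size in GB for a given number of CDs."""
    return num_cds * AVG_FLAC_PER_CD_GB

print(round(flac_library_gb(300)))  # -> 120 (GB)
```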


I just pull no more than an album at a time from people usually. Spread it out, come back the next day, etc. If you aren’t sure, use the chat function and ask them if they are OK with you queuing up more than a couple of full albums at a time.

I share freely, with no restrictions other than a bandwidth cap, and use a round robin to allocate upload slots to people, so if someone does queue up a hundred gigs of FLACs from me (in my library that can be a single artist) it doesn’t block everyone else for a week.


There are a few models available with 1920x1080 displays, but they are mostly still 4:3 screens and VGA inputs.


I have IPMI and web interfaces for most gear, I just don’t want to have to carry a laptop in every time I need to tinker.

I also have a bunch of AV switchgear and it would be handy to adapt a multiviewer to one of the VGA ports for monitoring that side of things too.


The Arr apps will automate downloads, but you can go into their UIs manually to override things when needed (like replacing a bad copy of a TV show, for example). Jellyseerr/Overseerr handles requesting and adding new shows/movies to be monitored from a simple webapp that you would host on the server and give them a shortcut to on their devices’ homepage.

I’d go with a 12th-gen or newer Intel CPU; something small and entry-level like a 12100 or 12400 is more than enough, as we just want the iGPU to handle the occasional transcode. Add 16GB of RAM, a cache SSD or two in a mirror, and a decent stack of HDDs of your choice. The OS can be anything you want, but I suggest going with something NAS-focused like Unraid, OpenMediaVault or TrueNAS (Jellyfin is not officially supported on TrueNAS but it does work). If it’s a new build from scratch for long-term archival of high-quality media, I’d start with at least 6 HDDs, with one for parity; if you can budget for 20TB drives, for example, that gives you a spacious 100TB of usable space with the ability for any one disk to fail without any data loss. You can then build that into a normal ATX PC case.
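The array maths above is just single-parity capacity: one drive’s worth of space is reserved for parity and the rest is usable (the Unraid-style model; drive counts and sizes are the example figures from the paragraph):

```python
# Usable space for a single-parity array: one drive's capacity is
# reserved for parity, the remaining drives' capacity is usable.

def usable_tb(drives: int, drive_tb: float, parity: int = 1) -> float:
    """Usable capacity in TB with `parity` drives reserved for parity."""
    return (drives - parity) * drive_tb

print(usable_tb(6, 20))  # 6x20TB, one parity drive -> 100.0 TB usable
```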

You can use Windows or any flavour of Linux, but you will be doing more work to make them work properly, where the above solutions are more plug and play.

I would make sure their hardware is capable of playing as many file formats and codecs directly as possible though. When you get into hosting 4K media, particularly full-fat UHD Bluray rips, you will find the apps built into TVs or lower-end streaming boxes just can’t do it and the server has to chug through transcoding on the fly. The iGPU can do that just fine, but you should try to avoid it for maximum performance and image quality, so perhaps budget for an Nvidia Shield or something.


Plenty of players still don’t support AV1, though with a modern GPU you can brute-force transcoding it on the fly.

I have a Jellyfin test server running on an Intel N95 and it can easily handle a couple of 4K AV1-to-H265 transcoded streams on its own with decent image quality, but it struggles with 3 if the bitrates are too high. Still, it’s more complexity than is needed, considering AV1 only saves a small amount of space over the good H265 encodes that are ubiquitous on the net.


I put off grabbing one of these when my work was clearing them out, giving them away for next to nothing.

Now, when I actually need one, I can’t find one for less than the cost of my best damn server. And nobody seems to make a basic cheap one.


I don’t think he’s enough of an asshole to admin that instance.

You can get banned there for sneezing the wrong way.

Hell I once got banned for talking about being banned.


Is it related to this issue posted to the BSD fork’s GitHub?

I can’t help you directly as I run Jellyfin on Linux, but that should be your first port of call. Just keep in mind Jellyfin on FreeBSD is 100% unofficial, so you are on your own.


For maximum compatibility with all services, you are limited in your choices due to DRM licence requirements.

You can mostly decrapify Android-based boxes via ADB: strip out much of the bloat, strip most of the telemetry entirely and block the rest in your firewall, and replace the launcher with a super-barebones one like FLauncher. But it will never be 100% perfect.

If you must be in full control of what is on the device and what it is doing, use a small, low-powered mini PC (the Intel N100 is a good chip for basic AV, for example: 4K 10-bit with perfect H265 and AV1 decoding) with the operating system of your choice. But you are then limited in what you can stream via browser or third-party apps, often in nowhere near full quality; again, this is due to licencing and DRM.

The best option is to avoid streaming services altogether and download your own content, then use an offline player like Kodi, or a server/client solution like Jellyfin (a free and open alternative to Plex, with most of the base features well implemented), to play it.


I have an Onyx Boox tablet and use Ubooquity as an ebook OPDS server on my Unraid box at home. It has an online reader that’s pretty good, but I just download the ebook file to local storage and use the much better reader built into the system. I’m a slow reader, so I don’t have to do it often.

I haven’t really found a third-party reader that is e-ink optimised and can seamlessly integrate an OPDS server. I’d like to find one, particularly if it has syncing between devices, as I also use a foldy phone as my main device, so it sees some use as a reader sometimes.

I also self-host a huge archive of manga in Komga, and access that on the tablet and phone via a Tachiyomi fork, which handles e-ink optimisation pretty well. It doesn’t sync between devices, but if I use the Komga web reader it does; that’s just a bit power-hungry on the Boox and has no offline functionality, so I manually keep things in sync, which isn’t that hard.


While it’s unlikely your ISP is blocking all uploads on that protocol, a VPN would bypass that.

So a VPN is worth testing.


What does your Docker port mapping look like?

Perhaps the port is open and forwarded on the router but not getting through the Docker network? Are you mapping 1:1 or using different internal/external ports?


Finding sources for niche media - Surround Music
G’day folks. Has anyone had any luck tracking down a source for surround and/or Atmos music? Whether it be DVD-A, BD-A, SACD, DTS-CD, etc. Not looking for concerts here (I have plenty of those) but proper albums specifically mixed in multichannel and spatial formats. I have pretty much everything I can find on Usenet and public trackers and have backed up all of my physical media, but there is a lot out there that I know exists.