Formerly /u/neoKushan on reddit
I think the argument about “for accessibility” misses the point a little, and it’s a common mistake most developers make.
You should endeavour to make your interface accessible by default. You shouldn’t be thinking in terms of “okay, here’s the design and here’s the design that’s accessible”; you should be considering accessibility in all of your designs.
Now that’s usually a bit harder with games because you have styles and themes that you don’t want to detract from, but if your interface causes accessibility issues, it’s generally going to be bad for people that don’t have accessibility needs as well.
Accessibility benefits everyone.
I don’t think your second point is correct. You can still embed analytics on a static website. I believe you’re conflating it with your first point by assuming that scripts are disabled on the browser side, in which case it’s a bit of a redundant point.
I also think it’s a bit unrealistic in this day and age to run with scripts completely disabled. I know it sucks, but we need better ways of protecting our privacy and disabling all scripts is a bit of an extreme measure given so much of the modern web relies on it.
The guy above you gives great advice. Set up SWAG; then the only port you’re exposing is 443.
Once you have that set up, look at adding something like Authelia. This will give you 2FA on top of those apps, meaning even if someone guesses the password and the URL to access them, they still won’t be able to get in.
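For reference, a sketch of one way to run SWAG with the linuxserver.io image so that 443 really is the only port you publish (the domain, DNS plugin and paths here are placeholders - DNS validation is assumed so you don’t need port 80 open; check the image docs for your own provider):

```shell
# Hedged example only: SWAG reverse proxy, DNS-based cert validation,
# so nothing but 443 is exposed. Replace example.com and the paths.
docker run -d \
  --name=swag \
  --cap-add=NET_ADMIN \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e URL=example.com \
  -e VALIDATION=dns \
  -e DNSPLUGIN=cloudflare \
  -p 443:443 \
  -v /path/to/swag/config:/config \
  --restart unless-stopped \
  lscr.io/linuxserver/swag:latest
```

Authelia then slots in as another container behind the proxy, with SWAG’s bundled sample confs pointing auth requests at it.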
That "traffic between two IP addresses" is enough reason to use a VPN you trust.
Put it this way, bit torrent traffic can be encrypted and routed over standard ports to make it look like regular web traffic, so still “just traffic between two IP addresses” but you wouldn’t run that without a VPN, would you?
I think those seed boxes you mentioned are the main reason OP isn’t using all their bandwidth. In the same way you suggest limiting total connections, those downloading will also have a limited number of connections, so of course they’ll prioritise peers on a gigabit+ uplink over those on slower links.
It all adds up and it all helps, of course.
If you’re trying to build it all from scratch, sure, but you specifically mentioned docker and there are plenty of high-quality docker images you can use - and it’s no harder to use a qBittorrent docker image than a transmission docker image.
Here’s the docker command for transmission:
docker run -d \
--name=transmission \
-e PUID=1000 \
-e PGID=1000 \
-e TZ=Etc/UTC \
-e TRANSMISSION_WEB_HOME= `#optional` \
-e USER= `#optional` \
-e PASS= `#optional` \
-e WHITELIST= `#optional` \
-e PEERPORT= `#optional` \
-e HOST_WHITELIST= `#optional` \
-p 9091:9091 \
-p 51413:51413 \
-p 51413:51413/udp \
-v /path/to/data:/config \
-v /path/to/downloads:/downloads \
-v /path/to/watch/folder:/watch \
--restart unless-stopped \
lscr.io/linuxserver/transmission:latest
and the equivalent for qBittorrent:
docker run -d \
--name=qbittorrent \
-e PUID=1000 \
-e PGID=1000 \
-e TZ=Etc/UTC \
-e WEBUI_PORT=8080 \
-p 8080:8080 \
-p 6881:6881 \
-p 6881:6881/udp \
-v /path/to/appdata/config:/config \
-v /path/to/downloads:/downloads \
--restart unless-stopped \
lscr.io/linuxserver/qbittorrent:latest
I’m not even going to argue that the qBittorrent docker image is technically easier because it has fewer options to configure; it’s all one command at the end of the day.
I understand your reasoning for not setting up the other *arr apps, due to not having a dedicated server to run them, however you’d still benefit from running them on your PC. They handle the downloading, extraction, categorising and naming of the media you want and they can do that automatically.
Even on your computer, that’ll save you time and effort. You can just tell it what shows you want - even shows that aren’t out yet - and it’ll grab them for you whenever they appear. It’s great for when you enjoy a show and the next season starts: it just grabs it for you and the show appears one day.
A lot of people start this way, and it’s only then that they think about getting a dedicated device for it. Such a device can be a decent little Synology or QNAP NAS - something small, quiet and power efficient - but I’d definitely say you don’t need to start there. It’s worth the effort to try though, believe me.
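If you want to dip a toe in, here’s a minimal Sonarr container in the same style as the commands above (a sketch based on the linuxserver.io image - the paths are placeholders you’d swap for your own):

```shell
# Hedged example: Sonarr from the linuxserver.io image.
# /downloads should be the same folder your torrent client writes to,
# so Sonarr can pick up and rename finished downloads.
docker run -d \
  --name=sonarr \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -p 8989:8989 \
  -v /path/to/sonarr/config:/config \
  -v /path/to/tvseries:/tv \
  -v /path/to/downloads:/downloads \
  --restart unless-stopped \
  lscr.io/linuxserver/sonarr:latest
```

Radarr (movies) and the rest of the *arr family follow the same pattern with different ports.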
Why are you fixing his PRs? Reject them for now, following your own practices, and link to the documentation about the practices the PR violates.
You’re not holding up the sprint by doing this; he is. As a team, you agreed on these practices and everyone needs to follow them. If he refuses, raise it with his line manager.
Either his line manager will bring him into line, or he’ll agree that the standards you decided upon don’t need to be followed. Take your pick.
As much as there’s an Activision fuckup here, there’s also a Hasbro fuckup. When you do a deal like this with a publisher, part of that agreement should include provisions for the source code - either handing it over directly or, if Activision didn’t want to share its proprietary code, indirectly via an escrow service.
Sure, it’s not an easy thing to achieve for sure, but I won’t lose sleep over them losing revenue because they can’t figure it out quickly enough.
Even more so when it comes to media that’s just not available any more. If you, a content IP owner, don’t make that content available for purchase, then you have only yourself to blame if people pirate it.
Yeah to echo other comments in here, it sounds like there’s some kind of config issue somewhere. The hue integration should “just work” and it’s where I get most of my own utility from.
I will say though, HA isn’t as user friendly as it could be. It has come a long way and it’s getting better (and there is definitely nothing better at the moment), but there are still some gremlins in there that require you to hand edit config files or understand obscure device names and things like that.
The same way you know how many times a show was watched legitimately: you take a sample of known data and extrapolate from there. It’s basically guesswork, but it’s educated guesswork.
BitTorrent, even though it’s decentralised, is still operating on the public internet using public, known protocols. You can join a swarm and get an idea of how big that swarm is with a small amount of data inspection. I mean, your torrent client knows how many seeders and leechers there are, right? Just watch the swarms and extrapolate from there.
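To make the sample-and-extrapolate idea concrete, here’s a toy sketch. Every number in it is hypothetical - in reality the peer counts come from joining swarms and querying trackers/DHT, and the coverage factor is itself an estimate:

```python
# Toy sketch of the "sample swarms, extrapolate" estimate described above.
# All figures are made up for illustration; real studies sample live swarms.
sampled_swarms = {      # torrent -> peers observed in the swarms we joined
    "show.s01e01": 12_000,
    "show.s01e02": 9_500,
    "show.s01e03": 8_100,
}
coverage = 0.05         # assume our sample covers ~5% of known public swarms

observed = sum(sampled_swarms.values())
estimated_total = observed / coverage   # scale up by assumed coverage

print(f"observed peers: {observed}")
print(f"estimated peers across all swarms: {estimated_total:,.0f}")
```

The whole estimate hinges on that `coverage` assumption, which is exactly why the headline figures come with such wide error bars.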
Any time you read these articles, they’re always caveated with something similar to “The number could be much higher than that” too because it’s not just torrents, you’ve got newsgroups, file shares, streaming sites, even old school IRC, people putting titles on a USB stick and so on. Hence there’s a lot of guessing, but it’s not entirely plucked from thin air.
Where it does get more bullshitty is when they try to translate those numbers into lost sales. Those are just made-up numbers as far as I’m concerned.
Honestly, if you’re familiar with setting up collections and libraries yourself, then these kinds of things should be pretty straightforward. There are loads of RPi-specific images that are quite literally “get a big enough SD card, use Etcher to burn this image onto it, plug it in and away you go”. There are some slightly more complex setups that let you plug in external drives, but they tend to come with instructions specific to them.
Hi,
You need to set up the folder paths within the docker side and the application side in order to do this properly.
Luckily there’s a set of guides that describes exactly what you need: https://trash-guides.info/Hardlinks/How-to-setup-for/Synology/
Read this guide and the other pages to get a full idea of the setup required.
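The short version of why those guides insist on one shared volume: hardlinks only work when the download folder and the media library sit on the same filesystem, and then “moving” a file into the library costs no extra disk space. A quick throwaway demo (paths are temporary, not a real layout):

```shell
# Demo of the hardlink behaviour the guides rely on.
# Two names, one inode, zero extra space used.
root=$(mktemp -d)
mkdir -p "$root/data/torrents" "$root/data/media"
echo "demo" > "$root/data/torrents/episode.mkv"
ln "$root/data/torrents/episode.mkv" "$root/data/media/episode.mkv"
stat -c %h "$root/data/media/episode.mkv"   # link count is now 2
```

If the two paths were on different mounts (or mapped as separate docker volumes), that `ln` would fail and the apps would silently fall back to copying - which is the mistake the guide fixes.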
Another recommendation for Tdarr - I set it up in January and let it transcode away, going to h265 for all my media. It’s saved me over 40TB of space so far, and I haven’t noticed a massive drop in quality or had any playback issues.
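Under the hood that kind of workflow drives FFmpeg, so a rough manual equivalent looks like this (the CRF value is my own assumption, not Tdarr’s default, and the first command just synthesises a short clip so the example is self-contained):

```shell
# Generate a 1-second synthetic test clip (demo input only).
ffmpeg -loglevel error -f lavfi -i testsrc=duration=1:size=320x240:rate=24 \
  -y /tmp/sample.mkv
# Re-encode the video to h265 (libx265), copying any audio streams untouched.
# CRF 22 is an assumed quality target; tune it to taste.
ffmpeg -loglevel error -i /tmp/sample.mkv \
  -c:v libx265 -crf 22 -preset medium -c:a copy \
  -y /tmp/sample.h265.mkv
```

The space savings come from h265’s better compression at similar perceived quality; the trade-off is CPU time during the transcode and slightly heavier decode on very old playback devices.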