Formerly /u/neoKushan on reddit

  • 0 Posts
  • 41 Comments
Joined 1Y ago
Cake day: Jun 16, 2023


Another recommendation for tdarr: I set it up in January and have let it transcode away, converting all my media to h265. It's saved me over 40TB of space so far, and I haven't noticed a massive drop in quality or had any playback issues.
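For anyone curious what that looks like under the hood, tdarr is essentially orchestrating ffmpeg. Roughly this kind of command per file (a sketch - the filenames and CRF value are illustrative, and it assumes an ffmpeg build with libx265):

ffmpeg -i input.mkv \
  -map 0 \
  -c copy \
  -c:v libx265 -crf 24 \
  output.mkv

-map 0 keeps every stream from the source, -c copy passes audio and subtitles through untouched, and only the video gets re-encoded to h265, which is why the quality loss is hard to notice.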


I think the argument about doing it "for accessibility" misses the point a little, and it's a common mistake most developers make.

You should endeavour to make your interface accessible by default. You shouldn't be thinking in terms of "okay, here's the design and here's the design that's accessible"; you should be considering accessibility in all of your designs.

Now that’s usually a bit harder with games because you have styles and themes that you don’t want to detract from, but if your interface causes accessibility issues, it’s generally going to be bad for people that don’t have accessibility needs as well.

Accessibility benefits everyone.


I don’t think your second point is correct. You can still embed analytics on a static website. I believe you’re conflating it with your first point by assuming that scripts are disabled on the browser side, in which case it’s a bit of a redundant point.

I also think it's a bit unrealistic in this day and age to run with scripts completely disabled. I know it sucks, but we need better ways of protecting our privacy, and disabling all scripts is an extreme measure given how much of the modern web relies on them.


You can shove most services behind cloudflare’s CDN with a bit of jiggery pokery. I’ve used netlify + cloudflare’s free tiers to great success a few times now.


Yeah honestly either solution is a solid one


The guy above you gives great advice. Set up SWAG; then the only port you're exposing is 443.

Once you have that set up, look at adding something like Authelia. This will give you 2FA on top of those apps, meaning that even if someone guesses the password and the URL to access them, they still won't be able to get in.
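For reference, SWAG itself is a single container. A sketch based on the linuxserver image - the domain and validation method are placeholders you'd swap for your own:

docker run -d \
  --name=swag \
  --cap-add=NET_ADMIN \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e URL=yourdomain.url \
  -e VALIDATION=http \
  -p 443:443 \
  -p 80:80 `#only needed for http validation` \
  -v /path/to/appdata/config:/config \
  --restart unless-stopped \
  lscr.io/linuxserver/swag:latest

It handles the Let's Encrypt certificates for you, and sample proxy configs for most popular self-hosted apps ship with the image.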


I appreciate what this project is doing. I’ve already got my setup configured using the trash guides, with recyclarr pulling in the latest config data for it. Is there a benefit to switching to Dictionarry, anyone know?


tachiyomi

Free and open source manga reader for Android.

(For those wondering what this discussion is about)


That "traffic between two IP addresse"s is enough reason to use a VPN you trust.

Put it this way: BitTorrent traffic can be encrypted and routed over standard ports to make it look like regular web traffic, so it's still "just traffic between two IP addresses", but you wouldn't run that without a VPN, would you?


The rights to search sure are, but it's more that Google happens to be the one paying for it right now. It could be Microsoft or Yahoo or anyone.

Mozilla definitely needs to diversify better here, but the implication that they're "funded by Google" is completely misleading.


I don't know much about the primary developers of Lemmy,

With respect, maybe you shouldn’t be commenting on what’s going on behind the scenes. They are good developers but they’re not good leaders or shepherds of such a big project. They need to hand over stewardship to someone that can be trusted.


Google pays them to be the default search engine, they’re not funded by Google.


We desperately need a company like Mozilla to take the reins of something like Lemmy. The original developers are far too biased and short-sighted to see the bigger picture; it needs to be an independent group that promotes more open source development.



I think those seed boxes you mentioned are the main reason OP isn't using all their bandwidth. In the same way you suggest limiting total connections, those downloading will also have a limited number of connections, so of course they'll prioritise peers on a gigabit+ uplink over peers on slower links.

It all adds up and it all helps, of course.


If you read up through the thread, the person I responded to specifically said that transmission was the easiest to run via docker.



If you're trying to build it all from scratch, sure, but you specifically mentioned docker, and there are plenty of high-quality docker images you can use - and it's no harder to use a qBittorrent docker image than a transmission docker image.

Here’s the docker command for transmission:

docker run -d \
  --name=transmission \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e TRANSMISSION_WEB_HOME= `#optional` \
  -e USER= `#optional` \
  -e PASS= `#optional` \
  -e WHITELIST= `#optional` \
  -e PEERPORT= `#optional` \
  -e HOST_WHITELIST= `#optional` \
  -p 9091:9091 \
  -p 51413:51413 \
  -p 51413:51413/udp \
  -v /path/to/data:/config \
  -v /path/to/downloads:/downloads \
  -v /path/to/watch/folder:/watch \
  --restart unless-stopped \
  lscr.io/linuxserver/transmission:latest

and the equivalent for qBittorrent:

docker run -d \
  --name=qbittorrent \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e WEBUI_PORT=8080 \
  -p 8080:8080 \
  -p 6881:6881 \
  -p 6881:6881/udp \
  -v /path/to/appdata/config:/config \
  -v /path/to/downloads:/downloads \
  --restart unless-stopped \
  lscr.io/linuxserver/qbittorrent:latest

I'm not even going to argue that the qBittorrent docker image is technically easier because it has less to configure; it's all one command at the end of the day.


qBittorrent, rTorrent and Deluge can all be run via docker with a web interface.


That the entire industry is cyclical and the current trends are yesterday's anachronisms. OOP vs functional, separation of concerns vs vertical slices - there are examples all over the place.

All of this has happened before and all of this will happen again.


I understand your reasoning for not setting up the other *arr apps, given you don't have a dedicated server to run them, but you'd still benefit from running them on your PC. They handle the downloading, extraction, categorising and naming of the media you want, and they can do all of that automatically.

Even on your computer, that'll save you time and effort: you can just tell it what shows you want - even shows that aren't out yet - and it'll grab them for you whenever they appear. It's great for when you enjoy a show and the next season starts; it just grabs it for you and the show appears one day.

A lot of people start this way, and it's only then that they think about getting a dedicated device for it - such a device can be a decent little Synology or QNAP NAS: something small, quiet and power-efficient. But I'd definitely say you don't need to start there. It's worth the effort to try, believe me.
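If you want to dip a toe in, Sonarr (the TV one) is a single container. A sketch using the linuxserver image - the paths are placeholders for wherever your media actually lives:

docker run -d \
  --name=sonarr \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -p 8989:8989 \
  -v /path/to/appdata/config:/config \
  -v /path/to/tv:/tv \
  -v /path/to/downloads:/downloads \
  --restart unless-stopped \
  lscr.io/linuxserver/sonarr:latest

Everything after that is configured through the web UI on port 8989: point it at your indexers and download client and it takes over from there.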


Yeah if it was just a switch I’d be fine, but for gateway/firewall options it’s a bit of a bugger unless I want a 1U device


I wish they had more 2.5G or even SFP+ options in this range. I'm lucky enough to have a >1 gigabit home connection, but router options are surprisingly limited if I want that full connection speed going to my server.


Why are you fixing his PRs? Reject them for not following your own practices and link to the documentation about those practices that the PR violates.

You're not holding up the sprint by doing this; he is. As a team, you agreed on these practices and everyone needs to follow them. If he refuses, raise it with his line manager.

Either his line manager will bring him into line, or he'll agree that the standards you decided upon don't need to be followed. Take your pick.


As much as there's an Activision fuckup here, there's also a Hasbro fuckup. When you do a deal like this with a publisher, part of that agreement should include provisions for the source code - either sending it directly or, if Activision didn't want to share its proprietary code, indirectly via an escrow service.


Sure, it's not an easy thing to achieve, but I won't lose sleep over them losing revenue because they can't figure it out quickly enough.

Even more so when it comes to media that's just not available any more. If you, a content IP owner, don't make that content available for purchase, then you have only yourself to blame if people pirate it.



I pay for a smattering of VoD services, I don’t lose sleep over watching something that isn’t available on them.

If corporate greed didn’t force a hundred different services on us, then it might be different.


Yeah, it’s very powerful, it just needs a bit of a UX refresh.


Yeah, to echo other comments in here, it sounds like there's some kind of config issue somewhere. The Hue integration should "just work", and it's where I get most of my own utility from.

I will say, though, that HA isn't as user-friendly as it could be. It has come a long way and it's getting better (and there is definitely nothing better at the moment), but there are still some gremlins in there that require you to hand-edit config files or understand obscure device names and things like that.


The same way you know how many times a show was watched legitimately: you take a sample of known data and extrapolate from there. It's basically guesswork, but it's educated guesswork.

BitTorrent, even though it’s decentralised, is still operating on the public internet using public, known protocols. You can join a swarm and get an idea of how big that swarm is with a small amount of data inspection. I mean, your torrent client knows how many seeders and leechers there are, right? Just watch the swarms and extrapolate from there.

Any time you read these articles, they're always caveated with something like "the number could be much higher than that", because it's not just torrents: you've got newsgroups, file shares, streaming sites, even old-school IRC, people putting titles on a USB stick and so on. Hence there's a lot of guessing, but it's not entirely plucked from thin air.

Where it does get more bullshitty is when they try to translate those numbers into lost sales. Those are just made-up numbers as far as I'm concerned.


Honestly, if you're familiar with setting up collections and libraries yourself, then these kinds of things should be pretty straightforward. There are loads of RPi-specific images that are quite literally "get a big enough SD card, use Etcher to burn this image onto it, plug it in and away you go". There are some slightly more complex setups that let you plug in external drives, but they tend to come with instructions specific to them.
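If you'd rather skip Etcher and write the image from a terminal, it's one dd command. A sketch - the image filename is illustrative, and be absolutely sure /dev/sdX really is your SD card before running it:

# Find the SD card's device name first
lsblk
# Write the image - this overwrites the target device entirely
sudo dd if=retropie.img of=/dev/sdX bs=4M status=progress conv=fsync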


Don't get suckered in by these; instead, look up a site like Arcade Punks that'll let you download images that are similar in nature - you just bring your own hard drive. There are loads for loading up a Raspberry Pi, all preconfigured for you.


Do Radarr's logs say anything around the file copy about why it couldn't hardlink?



Good luck, don’t be afraid to ask questions. It’s a lot to take in the first time you go through it all, especially if you’re not familiar with Docker and the concepts of containerisation but once you crack it, it’s seamless.


How has your Lemmy experience been on a self-hosted instance? I'm currently using lemmy.world and it's very error-prone; would self-hosting reduce those errors at the expense of anything? Does federation take long, or do you find you're getting federated content quickly enough?


Hi,

You need to set up the folder paths on both the docker side and the application side in order for this to work properly.

Luckily there’s a set of guides that describes exactly what you need: https://trash-guides.info/Hardlinks/How-to-setup-for/Synology/

Read this guide and the other pages to get a full idea of the setup required.
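The short version of what that guide sets up: give every container one shared parent mount rather than separate download and media mounts, so the source and destination sit on the same filesystem inside the container and the *arr apps can hardlink instead of copying. A sketch of the idea using qBittorrent (/volume1/data is illustrative - use whatever your Synology volume actually is):

# Layout on the NAS:
#   /volume1/data/torrents  <- download client writes here
#   /volume1/data/media     <- Sonarr/Radarr hardlink into here
# Mount the ONE parent folder; two separate mounts would look like two
# filesystems inside the container, and hardlinks can't cross filesystems.
docker run -d \
  --name=qbittorrent \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e WEBUI_PORT=8080 \
  -p 8080:8080 \
  -v /path/to/appdata/config:/config \
  -v /volume1/data:/data \
  --restart unless-stopped \
  lscr.io/linuxserver/qbittorrent:latest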



Plex is available in a lot more app stores than Jellyfin or Emby. I run a Plex server for friends, but I use Emby for my personal consumption. The reason I continue to use Plex is that it's available on all sorts of smart TVs and semi-obscure streaming devices that Jellyfin isn't.