Me

  • 24 Posts
  • 263 Comments
Joined 1Y ago
Cake day: Apr 29, 2023


Well, here is the relevant part then, sorry if it was not clear:

  • Jellyfin will not play well with reverse proxy auth. While the web interface can be put behind it, the API endpoints will need to be excluded from the authentication (IIRC there are some examples on the web), but the web part will still force you to double login and cannot identify the proxy auth passed down to it.
  • Jellyfin does support OIDC providers such as Authelia, and it’s perfectly possible to link the two. In this case, as I was pointing out, Jellyfin will still use its own authentication login window and user management, so the proxy does not need to be modified.

TLDR: proxy auth doesn’t work with Jellyfin; OIDC does, and it bypasses the proxy, so in both cases the proxy will not be involved.
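For illustration only (a sketch, not my actual config: the `/authelia` verify endpoint and the excluded paths are assumptions, check the examples on the web for the real list), the split looks roughly like:

```
# Web UI behind forward auth, API endpoints excluded (hypothetical paths)
location / {
    auth_request /authelia;              # assumed Authelia verify endpoint
    proxy_pass http://127.0.0.1:8096;
}
location ~ ^/(socket|Items|Videos|Sessions|emby) {
    auth_request off;                    # apps use Jellyfin's own tokens here
    proxy_pass http://127.0.0.1:8096;
}
```

And even with that in place, the double login on the web UI remains.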


This is my jellyfin nginx setup: https://wiki.gardiol.org/doku.php?id=services:jellyfin#reverse-proxy_configuration

Currently I don’t use any proxy-related authentication because I need to find the time to work with the plugins in Jellyfin. I don’t have any Chromecast, but I do regularly use the Android Jellyfin app just fine.

I expect, using the OIDC plugin in Jellyfin, that Jellyfin will still manage the login via Authelia itself, so I do not expect many changes in the NGINX config (except, maybe, adding the endpoints).


Never found a service that doesn’t work with an nginx reverse proxy.

My Jellyfin does.

Don’t run PhotoPrism though…


You might use LDAP, but it’s total overkill.

I have not yet gotten Jellyfin working with Authelia, but it’s more or less the last piece and I don’t really care so far if it’s left out.

A good reverse proxy with HTTPS is mandatory, so start with that one. I mean from all points of view, not just login.

I have all my services behind nginx, then Authelia linked to nginx. Some stuff works only with basic auth. Most works with headers anyway, so natively with Authelia. Some bitches don’t, so I disable Authelia for them. Annoying, but I have only four users so there is not much to keep in sync.


They actually do. I went down the same path recently and installing Authelia was the best choice I made. Still working on it.

But most services support either basic auth, header auth, OIDC or similar approaches. Very few don’t.


Ok, I have a web browser on a locked-down device and nothing else: how do I print a PDF or a photo using IPP?

I have a camera, a browser and a file manager (kind of, think of an iPhone or some stock Android business device), and I need to print a photo taken with the camera or a PDF sent to me via email or WhatsApp.

The device is connected to the WiFi guest network with limited internet access (if any), and the only available service is a server with port 443 open (a reverse proxy on that, captive portal and such).

In my experience, there is no way to print via CUPS in this configuration. Maybe I am wrong?


It still requires the device to be capable of printing…

And the user to find the printer, select it and so on. And it must expose more ports on the network besides 443…

So, indeed, CUPS is a great solution, but not to the problem I want to solve.

I do use CUPS in fact for the trusted part of the network: driverless printing for Windows and Linux. Android doesn’t even need CUPS since it picks up the printer directly from the printer itself (AirPrint or whatever that’s called).


I know CUPS can share printers and queues.

What is unclear?

I don’t want to pull drivers or install CUPS on devices. I want to print from anywhere by just uploading a file to a web page.

If I have lots of devices, or just want to let somebody print from their phone/tablet without installing or configuring anything…

With CUPS I still need to touch the system or the device somehow to let it print.


Yes, this is what I am afraid of… There is nothing out there for this task.

Hope to find something, or maybe try to create something using lpr in the background… But this is the last hope as I have little time.


I want to print from a web page: upload the file, hit print button.

In this way I can print from whatever device I want even without any driver installed or configuration.


Yes, sure I am… Would probably prefer a bash CGI because I like challenges :)

(Auth and such would be managed by my reverse proxy)

But I would prefer something already baked if it exists.
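For the record, a bash CGI along these lines is roughly what I have in mind (a sketch only, untested against a real printer; the PRINTER name and the PRINT_CMD override are assumptions I added so it can be dry-run):

```shell
#!/bin/sh
# Sketch of a print-upload CGI: the reverse proxy handles auth, this just
# takes the raw request body and hands it to lpr. Hypothetical setup.

webprint() {
    PRINTER="${PRINTER:-office}"              # assumed CUPS queue name
    PRINT_CMD="${PRINT_CMD:-lpr -P $PRINTER}" # override (e.g. with "true") to test

    tmp=$(mktemp /tmp/webprint.XXXXXX)
    # CGI gives us the request body on stdin and its size in CONTENT_LENGTH
    head -c "${CONTENT_LENGTH:-0}" > "$tmp"

    # CUPS auto-detects PDFs and common image formats, no driver needed
    if $PRINT_CMD "$tmp"; then
        status="Sent to printer $PRINTER"
    else
        status="Print failed"
    fi
    rm -f "$tmp"

    printf 'Content-Type: text/plain\r\n\r\n%s\n' "$status"
}

# Only run automatically when invoked by the web server as a CGI
if [ -n "${GATEWAY_INTERFACE:-}" ]; then webprint; fi
```

This handles a raw upload body (e.g. a curl/fetch PUT), not multipart forms; parsing multipart in plain sh is the painful part, so something already baked would still be nicer.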


You get a real IP? It’s been CG-NAT with every provider for the last many, many years in Italy.

I got a cheap VPS and just run some reverse tunnels to map ports from it to my home server, going through my CG-NAT.
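Concretely (a sketch, host names and the user are made up), each tunnel is one ssh remote forward, kept alive by autossh or a systemd unit:

```
# ~/.ssh/config on the home server (hypothetical names)
Host vps-tunnel
    HostName vps.example.com
    User tunnel
    ExitOnForwardFailure yes
    ServerAliveInterval 30
    # expose home's 443 on the VPS; needs "GatewayPorts yes" in the VPS sshd_config
    RemoteForward 0.0.0.0:443 localhost:443
```

Then `ssh -N vps-tunnel` from home, and the VPS port behaves as if it were on a real IP.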


Web printing
Hi! I have set up ScanServJS, which is an awesome web page that accesses your scanner and lets you scan and download the scanned pages from your self-hosted web server. I have the scanner configured via SANE locally on the server, and now I can scan via web from whatever device (phone, laptop, tablet, whatever) with the same consistent web interface for everyone. No need to configure drivers anywhere else. I want to do the same with printing. On my server, the printer is already configured using CUPS, and I can print from Linux laptops via the shared CUPS printer. But that requires a setup anyway, and while I could make it work for phones and tablets, I want to avoid that. I would like to set up a nice web page, like for the scanner, where the users, no matter the device they use, can upload files and print them. Without installing or configuring anything on their devices. Is there anything that I can self-host to this end?
fedilink


Power outage worries
Hi fellow hosters! I do self-host lots of stuff, starting from the classical *Arrs all the way to SilverBullet and photo services. I even have two ISPs at home to manage failover in case one goes down; in fact I do rely on my home services a lot, especially when I am not at home. The main server is a powerful but older laptop in which I have recently replaced the battery because of its age, but my storage is composed of two RAID arrays, which are of course external JBODs with external power supplies. A few years ago I purchased a cheap UPS, basically this one: EPYC® TETRYS - UPS https://amzn.eu/d/iTYYNsc It works just fine and can sustain the two RAIDs for long enough until any small power outage is gone. The downside is that the battery itself degrades quickly and every one or two years tops it needs to be replaced, which is not only a cost but also an inconvenience, because I usually find out at the worst possible time (a power outage), of course! How do you tackle the issue in your setups? I need to mention that I live in the countryside. Power outages happen like once or twice per year, so no big deal, just annoying.

No, because I did it over a long time, since I encoded only during the day hours and a few episodes at a time.

I can say that I fit a good 30% more episodes in the same space, but at the same time I also added movies and reduced sizes too, so it’s hard to tell reasonably.

I can say that my collection was mostly h264 before.


Maybe I have a different point of view here, but I have actively converted all my TV series to AV1 and will probably do the same to most of my movies.

The space saving is huge, and the quality is identical to my eyes and hardware. True that storage is cheaper than ever, but this is not a valid reason to waste it anyway.

I have only 6TB of storage for my media, and the power needed to run additional disks would only be wasted in the long run, so buying new bigger disks would be a waste for stuff I don’t watch often (more like hoarding than…).

So AV1 is the way. Software encoding is the best quality, I have heard, rather than hardware encoding. As for playback, I have a Fire Stick with AV1 support that works flawlessly, so.

Edit: I have PV at home, so converting to AV1 during daylight is actually free for me.


Italian content? Some, but not much. Way less than torrents.

I speak from experience.


For anybody waiting for invites: I gave up and just waited for the Black Friday deals and openings: you need to head to “the other site”, watch the deals and opening times, and you can get anything you want.

Also, invites are not free: you still need to pay the subscription!


They get DMCA’d regularly and content gets removed on Usenet as well. But the fact that they have to report literally thousands of individual files every time makes it slow and inefficient. People will just reload the same item many times and it’s always there.

Each copy, each single file into which the copy is split, needs to be identified and requested for removal. Compared to torrents, it’s a long and complex task.


Well, my experience is that unless you set a static IP+DNS in Android’s WiFi advanced network settings, it will not obey DHCP option 6.

LineageOS, vanilla with MindTheGapps
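For reference, pushing the resolver via DHCP option 6 in dnsmasq is a one-liner (the 192.168.0.1 address here is a placeholder for your local resolver):

```
# dnsmasq: hand the local resolver to DHCP clients (DHCP option 6)
dhcp-option=6,192.168.0.1
```

Which makes it all the more annoying when Android just ignores it.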


Yes, perfectly… My guess is Android bypasses the local resolver and goes via DoH, which sucks hard.


Shimitar (creator), replying in Selfhosted@lemmy.world, thread “DNS issues” (120d ago):

Old thread, but I somehow solved it by reinstalling unbound and nuking the old config file…


Thanks! This explains a few things… But not why Android is IGNORING the DNS I pushed via DHCP, even with private DNS disabled…


It’s blatantly ignoring the DNS I set via DHCP, it seems. Only if I set it manually (static) will it use it! I have no subnets.


Private DNS is disabled.

And even ping from Termux fails…


Issue with local DNS and Android
I have a home network with an internal DNS resolver. I have some (public) subdomains that map to a real-world IP address, and map to the home server's private address when inside the home network. In short, I use unbound and have added some local-data entries so that, when at home, those subdomains point to 192.168.x.y instead. All works perfectly fine from Windows and from Linux PCs. Android, instead, doesn't work. With dynamic DHCP allocation on Android, the names cannot be resolved (ping will fail...) from the Android devices. With specific global DNS servers (like dns.adguard.com), of course, it will always resolve to the public IP. The only solution I found is to disable DHCP for the WiFi on Android and set a static IP with 192.168.x.y as the DNS server; in that case it will work. But why? Anybody have any hints? It's like Android has some kind of DNS rebinding protection enabled by default, but I cannot find any information at all.
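For anyone hitting the same setup, the unbound side of this is just local-data overrides like the following sketch (the name and the 192.168.0.1 address are placeholders for the 192.168.x.y idea above):

```
server:
    # answer with the LAN address at home, let everything else resolve normally
    local-zone: "home.mydomain.com." redirect
    local-data: "home.mydomain.com. A 192.168.0.1"
```

The PCs pick this up fine; the whole question is why Android won’t.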

Install some DNS test app for Android and check that it does get resolved.

Mine will resolve the home server address to 0.0.0.0, and I get the same network error.


I have a very similar issue. It seems like Android will bypass your DNS resolver and thus cannot resolve your local names.

I have my home services on “home.mydomain.com”, accessible from outside and re-mapped to “192.168.0.1” (my internal server IP) at home, and all PCs can access it, while all Android phones can only resolve the public IP.

I feel it’s something related to DoT or similar but haven’t dug into that yet.


Been using USB since forever. I have one USB3 JBOD with four SSDs (2x RAID1 + 2x RAID1) and a USB-C double enclosure with two spinning disks (RAID1 as well).

The setup has been running flawlessly since 2016 at the very least, but I was using USB even before that timeframe.

All the RAIDs are Linux software RAID.


As codecs go, I love AV1. More and more hardware support everywhere; it seems more future-proof than h265 even at this point.

Rip at native DVD resolution and don’t forget ALL the audio tracks, as many English-speaking users forget about the rest of the world.


My guess is you had broken cables or defective connectors. Because even on cat5 (not cat5e) you should get much more than 7mbit, or did you have coaxial? LoL.

In my experience 90% of the time it’s the plugs, especially if you crimped them yourself with Chinese tools.


Had the same issue; it was heat.

Cool down your server: add a fan or a cooler…

I added a USB-powered fan sucking cooler air from outside the server area, blowing it directly on the chassis.

That fixed it for me.


Indeed you are, but if you release it you probably want people to use it, otherwise why release at all? Going the GitHub/GitLab way just makes that easier.

You are free to do what you want ofc, just my thoughts.


Why not publish on GitHub or GitLab? Or set up a Gitea/similar website and post it?

I did so with a few of my smaller projects.


I have read that Kopia has corrupted data systematically in the past. What’s your experience with it?




Restic or Borg on your side, a safe and remote destination on the other side.

I use restic, with the Backrest web GUI, and I cannot be happier.

As for the remote site, I use a remote machine I rent, but there are plenty of providers around, shop around a bit… Or find a friend for reciprocal backups?


Syncthing is the way… Been using it forever. Easy to set up, works flawlessly, doesn’t get in the way.

I use it on my server, PC, laptop and Android devices.

… Then you want to back up (Borg, restic…) your synced files of course …



Nginx “just works™”, has never gotten in the way, has been rock solid and has not changed significantly over the years.

Why would I need something else?


How to download amazon prime movies
As the title goes, is there a way to download content from Amazon Prime Video? Like yt-dl or similar...

DNS issues
Hi! I am self-hosting my services and using a DNSMasq setup to provide ad-blocking to my home network. I was tinkering with Unbound to add a fully independent DNS resolver and not depend on a Google/Adblock/whatever upstream DNS server, but I am unable to make Unbound work. Top-level domains (like com, org...) are resolved fine, but anything at the second level isn't. I am using "dig" (of course I am on Linux) and Unbound logging to find out what's going on, but I am at a loss. Could my ISP be blocking my requests? If I switch back to Google DNS (for example) all works fine, but using my Unbound will only resolve TLDs and some random names. For example, it will resolve google.com but not kde.org... Edit: somehow fixed by nuking the config file and starting over.
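For anyone landing here with the same symptoms: a fully independent resolver needs surprisingly little config, so a minimal sketch like this (interface address and LAN range are placeholders) is a sane baseline to start over from:

```
server:
    interface: 192.168.0.1
    access-control: 192.168.0.0/24 allow
    # no forward-zone at all: resolve from the root servers directly
```

If a nuked-and-rebuilt config this small works, the culprit was almost certainly a leftover forward-zone or similar stanza in the old file.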

FitTrackee
If I remember correctly, the FitTrackee dev does post on this community. Well, I want to thank him/her, as this is a very nice piece of software that I just started using, and it looks so promising and well done! A breeze to install, even on bare metal, and so well designed (even a CLI? Come on!). Looking forward to trying the Garmin integration tomorrow. Thanks buddy! Appreciated.

Self-hosted diary
Looking for a self-hosted diary type of service. Where I can log in and write small topics and ideas, tag them and date them. No need for public access. Any recommendations? Edit: anybody using MonicaHQ or has experience with it? Clarification: indeed I could use a general note-taking app for this task. I already host and use SilverBullet for general notes and such. I am looking for something more focused on daily events and connections. Like noting people met, sport activities and feedback, names, places... So tagging and dates would be central, but also connections to calendar and contacts, and who knows what else... So I want to explore existing, more advanced, more specialized apps. Edit2: I ended up with BookStack. MonicaHQ seems very nice but I proved unable to install it using containers. It would not obey APP_URL properly and would constantly mess up HTTP/HTTPS redirection. The community was unresponsive and apparently GitHub issues are ignored lately. So I ditched MonicaHQ and switched to BookStack: installed in a breeze (again, a container), and a very simple NGINX setup just worked. I will be testing it out now.

CalDAV web gui
Hi, I have been using Radicale since I switched from Nextcloud, with DAVx5 on Android, pretty nicely. I was thinking about adding a web UI to access my calendars from the web too... Any recommendations? Radicale's web UI only manages accounts and stuff, not the calendar contents.

Looking for a way to monitor my services
Hi! I have a mixed set of containers (a few, not too many) and bare-metal services (quite a few) and I would like to monitor them. I am using good old "monit", which monitors my network interfaces, filesystem status and traditional services (via pid files). It's not pretty, but it gets the work done. It seems I cannot find a way to have it also monitor my containers. Consider that I use podman and have a strict one service, one user policy (all containers are rootless). I also run "netdata" but I find it overwhelming: too much data, too much graphics, just too much for my needs. I need something that: - lets me monitor service status - lets me monitor container status - lets me restart services or containers (not mandatory, but preferred) - has a nice web GUI - the web GUI is also mobile friendly (not mandatory, but appreciated) - can print some history data (not mandatory, but interesting) - can monitor CPU usage (mandatory) - can monitor filesystem usage (mandatory) I don't care for authentication features, since it will be behind a reverse proxy with HTTPS and proxy authentication already. I am not looking for a fancy and complex dashboard, but for something I can host on a secondary page that I open if/when I want to check stuff. Also, if the tool can be scripted or accessed via an API that could be useful, as I would write some extractors to print a summary in my own dashboard.
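On the monit-plus-podman point: monit can at least watch rootless containers through a program check. A rough monitrc sketch (the container name and the wrapper path are assumptions):

```
# monitrc sketch: poll a rootless podman container's state
check program jellyfin-container with path "/usr/local/bin/check-container.sh jellyfin"
    every 2 cycles
    if status != 0 then alert
```

Here the hypothetical wrapper would run something like `podman healthcheck run jellyfin` (or a `podman inspect` on the state) as the container's own user and exit non-zero when it is down. No restart button or pretty graphs, though, which is why I am still looking.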

My take on selfhosted photo management
I have spent quite a lot of time trying to find the *best* photo management solution for my use case, and I think I finally have a solution in mind. Please follow me and help me understand what could be improved. The use case: I took, over the decades, thousands of pictures with manual film-based SLRs, digital DSLRs and many other devices. Today I mostly only take pictures with my phone and occasionally (like 1-5 rolls per year) B/W film photos. I like to have all the pictures neatly organized per album. Albums are events, trips, occasions or just a collection of photos together for any good reason. I have always organized albums by folders and stored metadata either in the photo or in sidecar files. Over the decades I changed many management tools (the longest-lived has been Digikam) but they all faded away for one reason or the other. I do not want to change organization since it proved solid over decades. I do not trust putting all eggs in a database or a proprietary tool format. The needs: backup photos from family phones. Organize photos in albums (format as stated above), share & show pictures with family (maybe broader public too), archive for long-term availability. Possibly small edits like rotation. Face recognition is a good plus, geographical mapping and reverse geotagging is a great plus. General object recognition could be useful but not a noticeable plus. Also I need multi-user support for family members, both on backup and gallery-like browsing. My galleries need to be all shared (or better, one big gallery, plus individual backups for users). What I don't need: complex editing/elaboration (would be done offline with darktable). Non-negotiable needs: storing photos in an album-based subfolder structure with all metadata inside photos or sidecar files. No other solution will ever stand the test of time. I tried many tools and none fits the bill.
Here are my experiences: - Immich: by far the most polished, great for phone backup&sync, not good for album organization (photos cannot be sorted into folders, albums are logical only). Has the best face detection and reverse geocoding. - PhotoPrism: given up because I don't like open source with money tags (devs have all the rights to ask for money, but I distrust a model where they might give up support unless they make money). - LibrePhotos: feels abandoned, and the UI & face detection are subpar compared to Immich. - PiGallery2: blazing fast and great UI, but cannot be used for backups nor organization. Yet it can cope well with my long-lasting collections of photos. - Piwigo: I used this decades ago. By today's standards it feels ugly, bloated and slow as hell. No benefits anyway for my use case that compensate the sluggishness. And my server is **powerful**. - Damselfly: great tool and super friendly dev, unfortunately I could not fit it into my use case. It can work on folders, but its actions are too limited, and besides downloads and exports and tagging... not much else. Not even backups from phone. I understand its use case is totally different from mine. Still a great piece of software. My solution: more of an idea of how I want to proceed from here on... Backup: keep the great Immich for phone backups. Limitations: requiring **emails** as user logins breaks my home server authentication scheme, but I can live with it. The impossibility to organize photos in folders is a deal breaker, but luckily you can define "logical" albums and **download** them. Organization: good old filesystem stuff, I don't need any specific tools. Existing photos are already sorted in subfolders; new albums can be created from Immich, downloaded, and stored in new subfolders on the server. Non-phone albums (DSLR, film cameras...) can just be added as well directly on the filesystem. Viewing: PiGallery2 pointed at the subfolders, blazing fast viewing online for all family members.
Global workflow: take photos from phones, upload automatically to Immich, then manually sort them into albums, download albums and create appropriate subfolders on the server (if needed to save space, delete downloaded photos from Immich). Upload/unzip and enjoy from PiGallery2. -- OR -- take photos with other cameras, scan/process on PC (darktable), create appropriate subfolders on the server, upload and enjoy from PiGallery2. All in all, what pisses me off in all this is: - Immich requiring a fucking **email** address to login (not a privacy concern here, but my users will need to remember a different login for this specific part) - Immich not supporting *subpaths*: I will need **two** subdomains to achieve this workflow, while just one would have been less complex for the users (something like photos.mydomain.org/gallery and photos.mydomain.org/backup, instead of photobackup.mydomain.org and photogallery.mydomain.org, you get the idea). I know all the blah blah about subdomains being better and such, I don't care; this is a usability issue for dumb users and, in general, it's the way I prefer it to be. Of course, the best course would be for Immich to support folders (not external libraries, but actual folder-based albums, which is a totally different approach) and to be able to move photos to folders, but hey, it wouldn't be fun in that case :) Any thoughts? UPDATE: Immich **storage templates** seem to be the missing link. Using that properly would cut out the manual download/reupload approach. Need to experiment a bit, but it looks promising.

Web based markdown gui
I am setting up my notes approach, which is using dedicated apps on my devices plus Syncthing. I tried lots of tools like Joplin, Obsidian etc. but they are overkill or had something I don't like. So I am using Markor on Android and another dedicated app on Linux and so on. I would also like to add a web app to edit the MD files directly on my server when I don't have any way to install Syncthing or an editor app. The web GUI would need to list the MD files local to the server and let me edit/view/save them. Upload and download are not required as I already have that set up via filebrowser. Any hints? Edit: to be clear, I am not looking for an IDE or anything fancy, I only need to edit some notes online on my server. I do not want to spin up containers or deploy full VS solutions just for this; all I need is a web GUI editor for MD with the capability to load files on the server. Second edit: I ended up self-hosting Silverbullet.md, which made my day. Exactly what I was looking for, even more than that. Thanks all!

Selfhost wiki (personal)
I have finally got my self-hosted wiki into a satisfying shape. It's here: https://wiki.gardiol.org Take a look, I hope it can help somebody. I am open to any suggestions about it. Note: the most original part is the one about multi-homed routing, failbacks and advanced routing.

Batch video conversion from command line
Hi fellow sailors, I have lots of downloaded... ISOs... that I need to convert to save space. I would like to make them all the same size, let's say 720p, and the same format, let's say h265. I am on Linux and the... ISOs... are sorted into a neatly named hierarchy of sub-folders, which I want to preserve too. What is the best tool (CLI) for the job? I can use ffmpeg with a bash script, but is there anything better suited?
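For what it's worth, the plain find+ffmpeg route would look roughly like this (a sketch, untested on real files; the extension list, the CRF value and the RUN=echo dry-run switch are my assumptions):

```shell
#!/bin/sh
# Batch re-encode sketch: mirror one folder tree into another, converting
# every video to 720p h265. Set RUN=echo first to dry-run the commands.

transcode_all() {
    src="$1"
    dst="$2"
    run="${RUN:-ffmpeg}"

    find "$src" -type f \( -name '*.mkv' -o -name '*.mp4' -o -name '*.avi' \) |
    while IFS= read -r f; do
        rel="${f#"$src"/}"                 # path relative to the source root
        out="$dst/${rel%.*}.mkv"
        mkdir -p "$(dirname "$out")"       # preserve the sub-folder hierarchy
        # scale=-2:720 keeps the aspect ratio; CRF 26 is a middle-ground default
        "$run" -i "$f" -vf scale=-2:720 -c:v libx265 -crf 26 \
               -c:a copy "$out"
    done
}

# Example (dry run): RUN=echo transcode_all /data/isos /data/isos-720p
```

`-c:a copy` keeps all audio tracks untouched; drop it if you want the audio re-encoded too.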

Download an… iso… to find all the files inside with strange names?
Let's say I download an iso for my latest favourite distro and, after unpacking the rar (usenet), I find the right contents but all the filenames are a bunch of hexadecimal strings. The files are legit, but how do I "decode" the names to know which one is file n.1, file n.2 and so on?

Joplin alternative?
I use Joplin and I do like it very much, but I would like to be able to at least view (not edit) the notes from a web browser... which is not supported. Are there good alternatives that are: - fully open source - have an Android client - have a web client or viewer - can be synched via WebDAV or a native method I can also settle for a Joplin web viewer of sorts! UPDATE: I opened up a can of worms. I would never have thought there would be so many tools for this task, and so many different shades of how it can be done. Even excluding ALL the non-truly-FOSS solutions out there, there are still tons of tools with good points and bad points. Of course, NONE fits my bill, so I will spin up my own… Joking, I have no time for that. Using joplin-webview feels too much. Spinning up containers just for that, meh. Will try though. The Joplin .md files are only "sync" files, from which you can probably extract the notes. But that would not be the best idea. Maybe some kind of link to the Joplin terminal client would be the way forward. I will see. I will stay on Joplin; it's the closest I could find to what I need, and the only thing lacking is a web viewer, which I can live without for the time being after all. Thank you all, and to anybody still chiming in!

Fighting with immich
After all the amazing reviews and posts I read about Immich, I decided to give it a try. To be honest I am quite impressed: it's fast and polished, it just works. But I found a few quirks, and hit a wall with the developer, who doesn't seem keen to listen to users that much (on these issues at least!). Maybe you guys have suggestions? Here I go: One: it does not support base URLs, which means that I had to spin up a dedicated subdomain to be able to access it over the internet while all my other services are on a single subdomain. I can work with that, but why? The dev already shut this request down in the past as "insecure", which I find baffling. (I mean use mydomain/immich instead of immich.mydomain.) Two: auth cannot be tied to the reverse proxy. I get it, it provides OAuth. But it's much more complex than proxy-based auth... and overkill for many cases, mine for sure. Three: impossible to disable authentication at all, which would work just fine in my use case. There is a switch that seems to be for that, but no, it's only for using OAuth. Four: I cannot find a way to browse by location, only by map (the locations list seems to be half-baked unless I am missing something). Five: no way to deploy on bare metal, and I tried! Due to lack of documentation (the only info I found was very, very outdated), and no willingness to provide info about that either. It seems that Docker is so much better that supporting bare metal is a waste of time. Six: basically impossible to easily manage public albums, like a public landing page. I get this might be outside Immich's scope. Seven: even if you can now import existing libraries, it still does not detect albums within them (sub-folders), which is very annoying. So, overall it's a great project and very promising, faster and more reliable than LibrePhotos in my use case, but still lacking some basic features that the dev seems not interested in adding. He developed it to please his wife, I get it :) - no pun intended, doing all this takes lots of time, I know.
These are the alternatives I know of: PhotoPrism requires a subscription for reverse geocoding. LibrePhotos feels sluggish and kind of abandoned. Are there any others? (Piwigo and Lychee are great tools, but a different kind of tool.) Let's hope for Immich; the dev is working a lot, let's hope for the best.

Why docker
Hi! Question in the title. I get that it's super easy to set up. But is it really worthwhile to have something that: - runs everything as root (not many well-built images with proper user management, it seems) - you cannot really know which stuff is in the images: you must trust whoever built them - makes lots of mess in the system (mounts, fake networks, rules...) I always host on bare metal when I can, but sometimes (Immich, I look at you!) it seems almost impossible. I get Docker in a work environment, but for self-hosting? Is it really worthwhile? I would like to hear your opinions, fellow hosters.

Tabula rasa
Tabula rasa registration is open.

Updated my Gentoo guide to Sailing the High Seas
Well, I decided to brush up my simple HTML page and created a fully linked wiki on the subject. Please take a look, in the hope it will be useful for at least one fellow one-eyed, peg-legged, passionate data hoarder. Any hints or suggestions are appreciated.

Writing a guide
Hi all fellow sailors. After having spent lots of time and effort recently to properly set up my environment, I have put together (more for MY personal future reference than anything) the guide linked to this post. It's not done, a few things are missing, but I hope it could be useful for more people in the future. The link is: https://www.paneburroezucchero.info/sailing.html Cheers!

Sailing Arr seas and the Usenet oceans
One month ago I decided to give the 'Arrs a chance and, while there are issues and limits, I'm loving them. But I have an issue: at home I only have internet access through a 5G mobile network connection, which means zero opportunity to have port forwarding or open ports at all. This rules out private torrent trackers (tried a couple, no luck in building any ratio ofc). Public torrent trackers being basically shit, I decided to give Usenet a try, and two things happened: 1. I started loving it! 2. I discovered I have a 1TB/month full-bandwidth cap on my home connection: after that, I get dropped from 200mb/s to 6mb/s, this time with unlimited bandwidth. I have a few suggestions first for newcomers: 'Arrs: start using them NOW. Also, they will help you organize your existing library, but be aware that doing a good job is not only mandatory but also time-consuming. Also, get Jellyfin and it will play along nicely with your organized (I mean it) collection. Make sure you set the proper umask and group (media management/advanced settings for each arr app) so that the entire stack and Jellyfin can write into your media collection: this will reduce issues with metadata sync a lot. Get Bazarr working with Subscene! And set up a nice nginx reverse proxy for the entire stack. Some issues I ran into: Readarr really has issues with finding stuff, especially audiobooks. Anybody able to help me out here? Lidarr always seems to go to torrents, which get stuck with no seeders for me. Is there music on Usenet? Now to the last part: Usenet! That changed my entire game. As for movies and TV series, I can literally find anything fast, and I saturated my 1TB plan in two days. I have Newshosting and recently got Eweka for less than 4€/month. Don't get caught in the common lie of three months free: they always charge 15 months immediately, so you cannot really test them out and then cancel. As indexers I got NZBGeek and I am planning to seek out DrunkenSlug. Any suggestions here?
(I know Newshosting and Eweka probably overlap; getting both was a mistake, but a relatively cheap one.) One last question: audiobooks and music on Usenet, what is your experience? One truly last question: any way to integrate Soulseek (Nicotine+) into the arr stack? Thanks fellow sailors.

Usenet
So, I want to give the Usenet world a try. I have the *Arr setup with torrents now, and it works great, but why not experiment a bit further? So many questions... First of all: will I find Italian content? Or is it all US/English stuff? I got an account on NZBFind, but that's only an indexer, right? Do I also need to find a Usenet access provider? If so, which access provider do you recommend? I want somewhere I can test for a few weeks or a month or so before committing, so no credit card upfront. Update: through TechRadar I got an offer on Newshosting for 5.99/mo with 3 months free. Paid with PayPal, then cancelled the recurring payment immediately (so paid 0 so far). But at the moment I'm not yet sure I have made it work properly in my *Arr setup.

*Arr stack and tips
Recently set up the *Arr stack & Transmission + Jellyfin and loving it... (Linux user, with an always-on server at home, behind a 5G unlimited connection.) I am a torrent person; I don't understand Usenet at the moment. A few questions for fellow sailors: - how do you safely and easily import your existing libraries? (movies, TV shows, books...) - how do you manage multiple languages in movies? Like having the movie in both French and English - where and how do you search for audiobooks? Really can't find many... Any tips for daily usage?

download videos
Hi fellow people! I want to download & save a huge set of small videos (20 min each) for a swimming course I paid for, and they don't let me download them, which is driving me crazy. The videos can be accessed via browser only, and I tried tubedowoader with no success (it gives an "empty keys" error while attempting to download). What should I try? (Android is first choice, then Linux ofc)