I’m trying to be more mindful about my YouTube consumption. There are a lot of quality channels out there, but sticking to subscriptions is difficult when the YouTube app on my TV throws so much distracting recommended content and Shorts at you, so I’d like a way to auto-download content from specific channels to play later via Plex. I actually have YT Premium, but I plan on putting that money into the Patreons of my most-watched creators instead.
Features I’m looking for:
Things I’ve looked into:
Anything I’m missing or are these basically the main options for now? Would love something as simple as Sonarr.
TubeArchivist works great for me: downloader, database, and player, all in one. Even integration with Jellyfin is possible; not sure about Plex, though.
I’ve tried installing it via Docker a hundred times and can never get it working. Maybe I’ll revisit it someday, but it also seems like more than I need, and the server it would run on isn’t super powerful.
Yeah, TubeArchivist is a beast: a Python Django app plus Elasticsearch and Redis. It uses about 1-2 GB of RAM for me. It works without issues for me since I’m fairly comfortable with Docker and wiring apps together, but upgrading it is always scary. I’ve broken the Redis container many times, though it recovers after deleting it and re-entering the settings.
https://github.com/Tzahi12345/YoutubeDL-Material
It does support SponsorBlock. You need to add an argument to the download command and it’ll apply to all new videos.
Also, if you use docker, I can help you with the installation.
Download this file: https://github.com/Tzahi12345/YoutubeDL-Material/blob/master/docker-compose.yml
Open a terminal and navigate to the folder where you downloaded the file.
Run this command and hit enter: docker-compose pull
Then run this and hit enter: docker-compose up
Wait up to 15 minutes and keep an eye on the container log.
Then try both ports, 8998 and 17442, and see which one works.
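For reference, here’s the whole thing in one go (just a sketch: it assumes Docker and docker-compose are already installed, and pulls the same compose file linked above via its raw URL):

```bash
# Grab the compose file and start the stack (sketch).
mkdir youtubedl-material && cd youtubedl-material
curl -LO https://raw.githubusercontent.com/Tzahi12345/YoutubeDL-Material/master/docker-compose.yml

docker-compose pull
docker-compose up -d   # -d runs it in the background; plain "docker-compose up" also works

# First start can take a while; watch the log while it initializes.
docker-compose logs -f

# Then open http://localhost:8998 or http://localhost:17442 and see which one answers.
```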
2nd’ing. I tried TubeArchivist too and preferred this
I keep getting this error:
EDIT: Just had to wait the fifteen minutes like you suggested!
Glad you got it working.
I’ve noticed it needs a lot of time to start even on good hardware, and the confusing part is that it throws errors while starting up rather than just doing nothing.
Now, to configure SponsorBlock, go to Settings > Advanced > Downloader and choose yt-dlp.
Then go to Settings > Downloader > Global custom args and type this: --sponsorblock-remove,all
Now, for all new videos, it will download the video and remove all SponsorBlock-marked segments.
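For what it’s worth, here’s roughly what that arg translates to on the plain yt-dlp command line (a sketch; as far as I can tell the comma is just how YoutubeDL-Material’s custom-args field separates an option from its value, while the CLI uses a space, and the video URL is a placeholder):

```bash
# Remove every SponsorBlock-marked segment from the downloaded file (sketch).
yt-dlp --sponsorblock-remove all "https://www.youtube.com/watch?v=VIDEO_ID"

# Or only strip specific categories; the exact category names depend on your yt-dlp version.
yt-dlp --sponsorblock-remove sponsor,selfpromo,interaction "https://www.youtube.com/watch?v=VIDEO_ID"
```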
Hi, me again. Sorry to use you for tech support, but I’m getting this error:
No problem. I think you missed a comma before all. It should look like this:
--sponsorblock-remove,all
Edit: I don’t know why the Lemmy client I use keeps mangling the double dash.
ohhhh that’s funny thanks!
Awesome, thanks, I think I’m going with this! The UI is good. I’m struggling to find a list of all the SponsorBlock args, know of anywhere?
Also do you happen to know what Plex scraper is best for YoutubeDL-Material’s default naming scheme?
Sure, you can find the documentation under the SponsorBlock options here: https://github.com/yt-dlp/yt-dlp
For the categories, the SponsorBlock GitHub is probably the best place to find them.
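If you have yt-dlp installed locally, you can also pull the relevant part straight out of its help text (a sketch; the number of context lines and the exact category list will vary by version):

```bash
# Show the SponsorBlock-related options and the categories they accept (sketch).
yt-dlp --help | grep -i -A 6 sponsorblock
```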
Unfortunately, I don’t know how to scrape that information or import it into Plex. Also, check out Jellyfin as an alternative to Plex.
For Plex:
In YoutubeDL-Material settings: Extra -> check “Generate NFO files”.
In Plex, create a library called “YouTube” or whatever you’d like, with the category set to “Other Videos”. Use the scanner “Plex Video Files Scanner” and set the agent to “Personal Media”.
Under Plex settings > “Agents”, make sure that under both the “Movies” and “Shows” tabs the “Personal Media” agent is set to use “Local Media Assets” and that it is top priority.
Plex will then use the NFO files generated by ytdl for metadata.
I believe there is a dedicated YouTube series agent, but those can be finicky. This way, ytdl has already done all the metadata work.
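If you want to sanity-check the folder before pointing Plex at it, something like this works (a sketch: /media/youtube and the .mp4 extension are placeholders for wherever and however ytdl writes your files, and as far as I know the .nfo has to share the video’s basename to be picked up):

```bash
# List any videos that don't have a matching .nfo next to them (sketch; adjust path and extension).
find /media/youtube -type f -name '*.mp4' | while read -r video; do
  [ -f "${video%.*}.nfo" ] || echo "missing nfo: $video"
done
```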
Thanks for this, this is the only breakdown I’ve been able to get (sort of) working.
Just to clarify- this won’t load thumbnails into Plex, correct?
Yeah, it won’t. Only Plex’s own screengrabs.
ytdl-sub is what you want. It does almost all of that, and it sets up NFOs for Plex. Really great app. Takes a bit to configure, but the dev is very responsive and is on Discord all the time.
This looks good, thanks, not sure how I missed it! I wish there were a GUI for configuring it, but other than that it looks great.
There are people working on the GUI. But honestly, once it clicks, it’s easy street. I highly suggest the Discord; a ton of people have been running it for a while and are very helpful. Say hi to codeslave there (me).
Wonderful. I’m gonna have a look at that, I’ve been hoping to find something that works with sponsorblock
On mobile you could also have a look at NewPipe. It does not have automated downloads but it shows you a simple list of all the videos from your subscriptions without any algorithm-based recommendations. It shows no ads and is fully open source.
Also supports plenty of services other than YouTube.
Newpipe is great at what it does, but I’m trying not to watch YT anywhere except on TV to avoid doing it mindlessly.
yt-dlp can take care of the downloading part. Just have a script that checks a list of channels for new content and set up a cron job for it, something like the sketch below.
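A minimal sketch of what that could look like (the paths, the channels.txt file, and the schedule are all placeholders; --download-archive is what stops reruns from re-downloading videos you already have):

```bash
#!/usr/bin/env bash
# fetch-channels.sh (sketch): grab new uploads from the channels listed in channels.txt.
# /opt/ytdl/channels.txt holds one channel or playlist URL per line.
yt-dlp \
  --batch-file /opt/ytdl/channels.txt \
  --download-archive /opt/ytdl/archive.txt \
  --playlist-end 25 \
  --sponsorblock-remove all \
  --merge-output-format mp4 \
  -o '/media/youtube/%(uploader)s/%(upload_date)s - %(title)s.%(ext)s'

# Example crontab entry to run this every six hours:
#   0 */6 * * * /opt/ytdl/fetch-channels.sh >> /opt/ytdl/fetch.log 2>&1
```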