• 3 Posts
  • 36 Comments
Joined 6M ago
Cake day: Mar 14, 2024

Selfhosted Piped instance loading infinitely
cross-posted from: https://discuss.tchncs.de/post/21001865

> I just installed Piped using `podman-compose`, but when I open up the frontend in my browser, the trending page just shows the loading icon. The logs aren't really helping; the only error is in `piped-backend`:
>
> ```
> java.net.SocketTimeoutException: timeout
>     at okhttp3.internal.http2.Http2Stream$StreamTimeout.newTimeoutException(Http2Stream.kt:675)
>     at okhttp3.internal.http2.Http2Stream$StreamTimeout.exitAndThrowIfTimedOut(Http2Stream.kt:684)
>     at okhttp3.internal.http2.Http2Stream.takeHeaders(Http2Stream.kt:143)
>     at okhttp3.internal.http2.Http2ExchangeCodec.readResponseHeaders(Http2ExchangeCodec.kt:97)
>     at okhttp3.internal.connection.Exchange.readResponseHeaders(Exchange.kt:110)
>     at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.kt:93)
>     at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
>     at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:34)
>     at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
>     at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95)
>     at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
>     at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83)
>     at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
>     at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76)
>     at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
>     at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
>     at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
>     at me.kavin.piped.utils.RequestUtils.getJsonNode(RequestUtils.java:34)
>     at me.kavin.piped.utils.matrix.SyncRunner.run(SyncRunner.java:97)
>     at java.base/java.lang.VirtualThread.run(VirtualThread.java:329)
> ```
>
> Would appreciate it if anyone could help me. I also wasn't sure what info to include, so please ask if there's any more info you need.

I actually already did that, but somehow it’s not quite working yet. Since I couldn’t figure out why on my own, I just made a post asking for help.


Regarding the last part: it is about privacy, but because the Piped instances aren’t working and I currently have to use YouTube directly, it’s still better. Besides, I’ve started writing a GTK 4 Piped client and I need some way to test it.


Thanks, I’m gonna selfhost it then. What you said about the Piped components sounds interesting; is there like a list of them?


Are selfhosted Piped instances still working?
All the public Piped instances are getting blocked by YouTube, but do small selfhosted instances that are only used by a handful of users, or just yourself, still work? I'm thinking of just selfhosting it. On a side note, if I do, I'd also like to install the new EFY redesign, or is that branch too far behind?

Edit: As you can see in the replies, private instances still work. I also found the instructions for running the new EFY redesign [here](https://efy.ooo/#faq#piped_instance)



If I was just downloading stuff, I wouldn’t be using a VPN. Not sure if it’s illegal but it’s just not worth it for the police to go after people who just download. With torrents, due to the way they work, you’re also uploading. That’s the reason you should use a VPN when downloading with torrents.



The only way to find out it was you would be to ask the VPN provider. Mullvad has a perfect track record of not keeping logs tho, so it’s very unlikely they’re gonna get anything from them. All that work wouldn’t even be worth it for someone just downloading some music like you do.

I have my torrent client running 24/7 connected to a different VPN, also from my home country, Germany, and nothing’s ever happened, even though Germany is pretty strict when it comes to this stuff.


I’ve wanted to contribute code to open source projects for years at this point, but looking at the code just seems so daunting. I’ve only contributed things like icons, translations and map data in OSM. I did start my first job as a software developer this week, so I hope I’ll get more experience working with existing projects.


I have Fedora installed on my system (I don’t know what the situation is like on other distros regarding ROCm) and my GPU is an RX 6700 XT. For image generation I use Stable Diffusion WebUI and for LLMs I use Text Generation WebUI. Both installed everything they needed by themselves and work perfectly fine on my AMD GPU. I can also give you more info if there’s anything else you wanna know.


I actually use an AMD card for running image generation and LLMs on my Linux PC. It’s not that hard to set up.


Lemmy should really add alt texts like Mastodon has


I use Grocy for the shopping list feature. It has a lot more functionality tho.


1337 is fine for most stuff, I think. Private trackers start to make sense when you want to automate downloading shows and movies but if you just wanna pirate some game, you’ll probably find it on 1337 with a ton of seeders anyways.


You know what they say, something is better than nothing


I’m Gen Z and I still know all this stuff because that’s just what I’m interested in. I don’t think it’s a huge issue that those things were made simpler for the average person and that they don’t know how it works. It’s not like you can or need to know everything.


It’s not that you’re missing a font but rather that your font is missing those symbols. They’re not just regular letters. Here’s what it should look like:


That’s what I meant at the beginning. As soon as I started making more profit than losses, it became very easy, which also made it kind of boring.


I actually really struggled with making money at the beginning. How did you do this?


Yeah, I voted for the Pirate Party but I’m probably gonna vote for Die Linke (eng.: The Left) next election


I have Home Assistant running with TTS and STT on a mini PC with an Intel N100 CPU and 16 gigs of RAM. Works great. LLMs and Stable Diffusion need way more processing power and RAM (or rather VRAM, cause both are very slow without a GPU), so that mini PC wouldn’t be enough for that tho.


It’s still one of the, if not the, best headsets available. PSVR 2 is the only headset I know of that is better in a lot of areas than the Index but the Index definitely still has the best controllers by far.


> But, over the last four years, CDPR has been able to turn the ship around with free updates, and DLC.

Why is it called free updates nowadays? Updates are always free, no?



If it’s not available as an application, you should probably look into docker compose


What I’m using is Text Generation WebUI with an 11B GGUF model from Hugging Face. I offloaded all layers to the GPU, which uses about 9 GB of VRAM. With GGUF models, you can choose how many layers to offload to the GPU, so it uses less VRAM. Layers that aren’t offloaded use system RAM and the CPU, which will be slower.


15 games at $70 each is enough to save $1,000, which is definitely enough for a good gaming PC. After buying a PS5 and the cheapest PS Plus subscription, paid yearly (cause that’s the cheapest option per month) for a little more than 8 years, you’re also at $1,000. With the most expensive PS Plus option it would only take a little more than 4 years.


In GNOME you just need to log in with your Nextcloud account in the system settings and it will show up in the file manager


There’s a project called Watchtower that automatically updates your running Docker containers (including ones started with docker-compose) whenever a new image is available.
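In case it helps, here’s a rough sketch of how Watchtower is often added to a compose setup. The service block follows Watchtower’s documented quick start, but treat the details as an example rather than a drop-in config:

```
# Add Watchtower as one more service in your docker-compose.yml.
# It needs the Docker socket mounted so it can pull new images and recreate containers.
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped
```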



That’s weird, I never had issues either. Did you check the box at the beginning so it doesn’t use more than 4 GB of RAM?


It’s just that I only read manga on my phone anyway (even though that might be because it’s not synced between devices) and I’ve never had the issue of an online source going offline. I just thought that maybe there are other reasons, like how you can get way better quality when you self-host something like Jellyfin for movies and shows.


Any advantage to using something self-hosted, like Komga, in Mihon (formerly Tachiyomi)?
I recently found out that instead of just using online sources, you can also use something you can host yourself, like Komga, in Mihon. I'm just wondering if there's an advantage to it that I didn't think of, because the only things I can think of are:

- Progress is synced over multiple devices
- Online sources can suddenly go offline, your self-hosted service won't


Can’t wait for forge federation; it’s super annoying that I need an account for each individual instance just to report a bug


A mini PC with an Intel N100 will be a little more expensive (I bought one for ~150€) but it’s about 5-6 times faster than the Pi and mine also came with 16 GB of RAM and a 500 GB SSD. It requires very little power and because of that, it’s also very quiet. AV1 decode is also great if you plan to run something like Kodi on it or you want to do transcoding from an AV1 video with Jellyfin (I haven’t migrated those to it yet, so I don’t know how well it works in practice). I’m not sure, but it might not even be a lot more expensive than a Pi with 8 GB of RAM and an additional 500 GB SSD.


You just need the `docker` and `docker-compose` packages. You make a `docker-compose.yml` file and there you define all the settings for the container (image, ports, volumes, …). Then you run `docker-compose up -d` in the directory where that file is located and it will automatically create the container and run it with the settings you defined. If you make changes to the file and run the command again, it will update the container to use the new settings.

In this command, `docker-compose` is just the tool that lets you do all this with the `docker-compose.yml` file, `up` means it’s bringing the container up (i.e. starting it), and `-d` is for detached, so it does that in the background (it will still tell you in the terminal what it’s doing while creating the container).
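To make that a bit more concrete, here’s a minimal sketch of what such a file could look like. The nginx image, the port mapping and the volume are just placeholders, swap in whatever service you actually want to run:

```
# docker-compose.yml — minimal illustrative example with one placeholder service
services:
  web:
    image: nginx:latest                      # the image to pull and run
    ports:
      - "8080:80"                            # host port 8080 -> container port 80
    volumes:
      - ./html:/usr/share/nginx/html:ro      # mount a local folder read-only into the container
    restart: unless-stopped                  # start it again after reboots or crashes
```

With that file in place, `docker-compose up -d` creates and starts the container, `docker-compose logs -f web` shows its output, and `docker-compose down` stops and removes it again.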