• 1 Post
  • 4 Comments
Joined 1Y ago
Cake day: Jun 13, 2023


I’ll be the heretic here, but as far as I know you are only required to make source available when you distribute binaries. And for that matter, it doesn’t even have to be online, just available upon request, unless you’re using a GPL derivative (like the AGPL) that adds an online-access clause.

I highly doubt the users of a web interface are required to be given access to the source. There are multiple GPL-licensed web servers (I’m well aware Apache is not, btw), and I’ve never seen one embed a source link on every page.

TL;DR: Lemmy does it, but I believe it’s not required. Modify away if you so choose.


Yes.

And to some of the child replies: I think there’s a question of scale that often gets overlooked. In all these discussions there seem to be two different groups commingling: those who just need 1-2 simultaneous streams, and those who are running true whole-house-plus systems.

I’m serving subtitle-enabled streams to (mostly) Roku clients, which need the server to burn in the subtitle track for some insane reason. It’s nothing for my Plex box to be serving 6 simultaneous streams. A 4790K would definitely not cut it for me.


Honestly, don’t bother with a dGPU and get a 12th- or 13th-gen Intel Core chip with QSV. Intel quietly tuned it up to the point where it’s faster than Nvidia’s NVENC engine even in the latest generation. On top of that, you don’t have to mess around with the stream-uncap hack, and you’re transcoding through system RAM rather than dGPU VRAM, so your stream limit is far less likely to be artificially constrained by memory.
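
If it helps, getting QSV working in a containerized Plex setup is usually just a matter of passing the iGPU’s render device into the container. A minimal sketch, assuming the linuxserver/plex image and the default /dev/dri device path (neither is from my own config):

```yaml
# docker-compose sketch: expose the Intel iGPU so Plex can use Quick Sync.
# Image name and paths are assumptions - adjust to whatever you actually run.
services:
  plex:
    image: lscr.io/linuxserver/plex:latest
    devices:
      - /dev/dri:/dev/dri          # render node the QSV encoder uses
    volumes:
      - ./plex-config:/config
      - /path/to/media:/media
    restart: unless-stopped
```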

To answer the question you actually asked, though: Nvidia’s NVENC is the best solution on a dGPU. Its performance is largely the same across cards of the same generation, with one exception in the GTX 10x0 series. The absolute cheapest card you can lay your hands on that has an NVENC engine is the 1050 Ti.

The caveat is that the 1070 and 1080 have two NVENC engines. In theory that doubles the maximum number of streams, but in practice you’re memory-bound on those cards and it’s more like a 33% bump.


I did. I could never get Ansible to work when the target was the same machine it was running from. If you know how to set up the inventory file for that, I’m all ears.
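
The closest I got was an inventory along these lines (a sketch from memory; the host name is made up, and it never worked cleanly against Lemmy’s playbook):

```yaml
# hosts.yml sketch (from memory): point Ansible at the machine it runs on.
# "lemmybox" is a placeholder, not anything from Lemmy's playbook.
all:
  hosts:
    lemmybox:
      ansible_connection: local   # run tasks directly on this host, no SSH
```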


Help standing up a self-hosted Lemmy instance
I'm trying to stand up a Lemmy instance, and for some reason I'm just not getting it. I've got a fair bit of experience with Linux and Docker. NPM is new to me, but doesn't seem difficult. I've looked over several walkthroughs, but none of them seems to quite work. Does someone have a clear step-by-step that works, or could someone take the time to remote in and help me get this up? I'm running on VMware ESXi, and I've tried both Debian and Ubuntu for the server. The closest I got, the Docker containers would start but seemed to be throwing errors internally and wouldn't connect to one another.
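
In case it helps anyone diagnose, the stack I'm aiming for is the usual four-container layout, roughly sketched below (image tags, domain, and credentials are placeholders rather than my real values, and the reverse proxy is left out):

```yaml
# Rough shape of the compose file I'm working from (placeholders throughout).
services:
  lemmy:
    image: dessalines/lemmy:latest
    environment:
      - RUST_LOG=warn
    volumes:
      - ./lemmy.hjson:/config/config.hjson
    depends_on:
      - postgres
      - pictrs

  lemmy-ui:
    image: dessalines/lemmy-ui:latest
    environment:
      - LEMMY_UI_LEMMY_INTERNAL_HOST=lemmy:8536
      - LEMMY_UI_LEMMY_EXTERNAL_HOST=example.com   # placeholder domain
    depends_on:
      - lemmy

  pictrs:
    image: asonix/pictrs:latest
    volumes:
      - ./pictrs:/mnt

  postgres:
    image: postgres:15-alpine
    environment:
      - POSTGRES_USER=lemmy
      - POSTGRES_PASSWORD=changeme    # placeholder
      - POSTGRES_DB=lemmy
    volumes:
      - ./pg-data:/var/lib/postgresql/data
```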