Hello y'all, currently I have an RTX 2060, which I'll be passing down so I can slap a 1060 into my server, but I'd like to weigh some options first.

The 2060 has been pretty good with Linux so far. I'm a little worried about going to the 30 series (so I'll be accepting affirmations), but I'm curious what any of you think about AMD cards and which one to get. Also, if there's any reason not to use a 1060 for Jellyfin and such, that would be very helpful.

Edit: thanks, y'all! Settled on an RX 6600; it runs local LLMs like nothing compared to my ol' 2060.

Nvidia drivers work the same now as a few years ago.

If you don’t like it, migrate.

If you don’t mind it, keep it.

@7toed@midwest.social (creator)

The 555 drivers have given me a couple of issues on my main PC, and I've heard of some issues with newer cards, but I figure it's just a matter of time before better drivers land. I haven't had to touch PCI passthrough yet, and I was unsure if there's more of a challenge on server hardware with Nvidia, but it sounds like I'll be happy to have CUDA.
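For anyone curious what PCI passthrough involves, here's a rough sketch of binding a GPU to vfio-pci on a typical Linux host. The PCI address and the `10de:...` vendor:device IDs below are example values for a GTX 1060; check your own with `lspci`:

```shell
# Find the GPU's PCI address and vendor:device IDs (example output)
lspci -nn | grep -i nvidia
# 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation GP106 [GeForce GTX 1060 6GB] [10de:1c03]

# Bind both the GPU and its HDMI audio function to vfio-pci at boot,
# instead of letting nouveau/nvidia claim them
echo "options vfio-pci ids=10de:1c03,10de:10f1" > /etc/modprobe.d/vfio.conf

# Enable the IOMMU on the kernel command line
# (intel_iommu=on for Intel CPUs, amd_iommu=on for AMD), e.g. in /etc/default/grub:
#   GRUB_CMDLINE_LINUX="... intel_iommu=on iommu=pt"
update-grub && reboot
```

After rebooting, `lspci -nnk` should show `vfio-pci` as the kernel driver in use for the card, and it can then be handed to a VM or container runtime.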

@filister@lemmy.world

What are you going to use this GPU for? If it's simply for gaming and you don't care about ray tracing, AMD is king. Or Intel, if you find a deal.

Self hosting LLMs and hobby AI/ML projects, NVIDIA.

Blender - NVIDIA

Internet Streaming - NVIDIA

Video editing - NVIDIA

Plex/Jellyfin - Intel

Unfortunately in most cases NVIDIA is still the king.

Check this link for an overview of the different GPUs: https://www.tomshardware.com/reviews/best-gpus,4380.html

@7toed@midwest.social (creator)

Kinda lowballed it and got an RX 6600; honestly I'm pretty chuffed at how well it runs LLMs compared to my 2060. Still last gen, but admittedly I've only been playing Factorio as of late, and I'm not so graphics-focused.

@InverseParallax@lemmy.world

AMD knocks Nvidia into a hat on Linux, the drivers are just too incredible.

With the exception of AI, where Nvidia is just plain the gold standard.

Intel is fine, it has exceptional video encoding and works.

@7toed@midwest.social (creator)

I've had an R5 1600X basically since it released, and I appreciate that the socket lasted so long. I might as well go full AMD at this point; you've convinced me.

Yeah, the ryzens are great too.

Full AMD will treat you well. I'm running dual Xeons and a Radeon Pro, with an Arc A770 just for AV1 encode right now.

Next round I'm going full Epyc.

If you have an Intel CPU with QuickSync, it will likely beat the 1060 in terms of visual quality, provided it's Coffee Lake or newer (8th gen).

If not, well, it’ll be fine up to whatever the stream limit is (4?).
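For a rough sense of the difference, here's what a Jellyfin-style transcode looks like with each encoder via ffmpeg. Filenames and bitrates are placeholders, just to illustrate the two paths:

```shell
# NVENC on the 1060 (requires the proprietary Nvidia driver)
ffmpeg -hwaccel cuda -i input.mkv -c:v h264_nvenc -preset p5 -b:v 8M out_nvenc.mkv

# QuickSync on an 8th-gen-or-newer Intel iGPU (requires the Intel media driver)
ffmpeg -hwaccel qsv -i input.mkv -c:v h264_qsv -preset medium -b:v 8M out_qsv.mkv
```

Jellyfin generates commands along these lines for you once you pick NVENC or QSV under hardware acceleration in the admin dashboard.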

@7toed@midwest.social (creator)

The Xeon E5-2640 unfortunately does not, though it's still an upgrade over what I have now. The stream limit seems entirely configurable, so that will just be a matter of stress testing.

Won't lie, I always forget about all the new CPU hardware acceleration after using decade-old hardware for so long 😅

Yeah quicksync won’t help you there.

I thought Nvidia's limit was enforced by their drivers, but that's probably changed; it's been a while since I looked at NVENC as a solution (QuickSync, then an Arc card over here).

Scrubbles

Nvidia is great in a server; the drivers are a pain, but doable. I have a 3000-series card that I use regularly and pass into my Kubernetes cluster. Nvidia on a Linux gaming rig is fine, but there's more overhead with the drivers.

AMD is great in gaming servers, but it doesn't have CUDA, so in my experience it's not as useful in a server environment, at least if you're thinking of running CUDA workloads like hosting LLMs.

The 1060 will be a noticeable step up for Jellyfin.

@7toed@midwest.social (creator)

I didn't even realize CUDA mattered so much for LLMs, thank you!

Scrubbles

Oh yeah, critical component. And VRAM, too; in fact I'd only consider LLMs on a 3000-series or newer card right now, since they require quite a bit of VRAM.
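As a back-of-the-envelope sketch of why VRAM is the bottleneck: model weights alone are parameter count times bytes per weight, plus some runtime overhead. The 1.2x overhead factor below is an assumption for KV cache and buffers, not a measured number:

```python
def estimate_vram_gb(n_params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate in decimal GB: weights only, scaled by a
    fudge factor for KV cache and runtime buffers (1.2x is a guess)."""
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization fits comfortably on an 8 GB card
# like the RX 6600; the same model at full 16-bit does not.
print(round(estimate_vram_gb(7, 4), 1))   # 4.2
print(round(estimate_vram_gb(7, 16), 1))  # 16.8
```

This is why quantized models made cards like the RX 6600 viable for local LLMs at all, and why bigger models push you toward 12 GB+ cards regardless of vendor.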
