Hello! This might not be the right community but I’m looking at building a new server and you operate in similar areas and might have experience with this. I’ve got a GPU without video outputs, can I combine that with a CPU without integrated graphics and still get video output from the HDMI located on the motherboard?
Probably not!
What models of GPU and Motherboard are you using?
I’ve got an Nvidia Tesla P40 and haven’t purchased a motherboard yet. It’s currently sitting in my DL380 doing nothing.
Do you want to stop using your DL380? If not, it might make a good Moonlight host!
My DL380 draws about 200W idle, so I’m trying to downsize.
Without specific experience, my assumption would be no, much like when you plug into a desktop computer’s motherboard HDMI port instead of the GPU’s HDMI port and get nothing.
No. The video card is only wired to send video out through its own ports (which don’t exist here), and the ports on the motherboard are wired to the nonexistent iGPU on the CPU.
Depends. In Windows you can send the signal out through another GPU’s port.
But if it works without an iGPU…
In Windows you’re not sending the signal directly through another port. You’re sending the dGPU’s signal through the iGPU to get to the port.
On a laptop with Nvidia Optimus or AMD’s equivalent you can see the increased iGPU usage even though the dGPU is doing the heavy lifting. It’s about 30% usage on my 11th-gen i9’s iGPU just routing the 3080’s video out to my 4K display.
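If you want to watch that overhead yourself on Linux, something like this should do it (a rough sketch; assumes the proprietary Nvidia driver plus the intel-gpu-tools package — on Windows, Task Manager’s GPU view shows the same split):

```
# Watch the dGPU doing the actual rendering work
nvidia-smi dmon

# In a second terminal, watch the iGPU's engines; the copy/blit traffic
# from shuttling the dGPU's frames out to the display shows up here
sudo intel_gpu_top
```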
In that case, never mind.
Carry on.
I just did a quick Bing Chat search (“does DRI_PRIME work on systems without a CPU with integrated graphics?”) and it says it will work. I can’t check for you because my CPUs all have integrated graphics.
I CAN tell you that some motherboards support it (my ASUS does) and some don’t (my MSI doesn’t).
BTW, I’m talking about Linux. If you’re using Windows, there’s a whole series of hoops you have to jump through. LTT did a video a while back.
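For anyone who wants to poke at it, basic DRI_PRIME render offload on Linux looks like this (a minimal sketch; assumes Mesa drivers and glxinfo from the mesa-utils package):

```
# List the GPUs your display server can see as providers
xrandr --listproviders

# Default renderer vs. the offload GPU's renderer
glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Launch a specific app on the second GPU (mygame is a placeholder)
DRI_PRIME=1 mygame
```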
While it might work once the OS is up, setting the OS up may be a pain (the installer may or may not work like that), and I strongly suspect that the BIOS can’t handle it.
I suspect an easier route would be to use a cheap, maybe older, low-end graphics card for the video output and then use DRI_PRIME to offload rendering to the other graphics card.
It’s probably a pain to set up in Windows. In Linux it just works; there’s nothing to set up. I’m using it right now.
OP really should have mentioned their OS.
Edit: Actually, never mind both my posts. I know DRI_PRIME works by using my APU for regular desktop activity and switching to the discrete GPU whenever a game is running. But I don’t know if it’s possible to make it use the dGPU all the time.
Even if it did, it would only work inside the OS, so if you had to boot into the BIOS for anything, you wouldn’t have a display. So for all intents and purposes, it wouldn’t really work.
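If anyone wants to experiment with forcing it anyway, one untested idea is setting the variable session-wide (a sketch, assuming Mesa and a desktop session that inherits /etc/environment; scanout would still go through whichever GPU owns the physical connector):

```
# /etc/environment - every GL/Vulkan app then renders on the offload GPU
DRI_PRIME=1
```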
This might be an XY problem. Why do you think you need HDMI output on a server?
Because installing an OS without iLO, serial, or video output would be a bit of a hassle.
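For what it’s worth, if the board exposes a serial header (or you add a cheap USB-serial adapter), a headless install over serial is at least possible. A rough sketch, assuming a GRUB-booted Linux installer and the port showing up as ttyS0:

```
# Append to the installer's kernel command line
console=ttyS0,115200n8

# Then attach from another machine over the serial link
screen /dev/ttyUSB0 115200
```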