
regarding the pricey enclosures: there are vastly cheaper eGPU solutions, especially if you’re able to utilise the on-board M.2 or mini-PCIe slot. if you don’t move the laptop around, it’s a viable option. this would be an example - not an endorsement. you’d need a ~$15 PSU to power the graphics card, and it works well in linux, with hot-plugging being the primary issue; if you’re willing to shut down before attaching the eGPU, there are close to no issues.
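once it’s powered up, a quick sanity check that linux actually sees the card. just a minimal sketch using the nvidia-ml-py bindings (assumes the NVIDIA driver is already installed; `nvidia-smi` tells you the same thing):

```python
import pynvml  # pip install nvidia-ml-py; talks to the installed NVIDIA driver

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older bindings return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {name}, {mem.total / 2**30:.1f} GiB VRAM")
pynvml.nvmlShutdown()
```

if the card shows up here after a cold boot with the eGPU attached, the adapter and PSU side of things is working.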

you can run it as a graphics card (i.e. utilise its display outputs) or just use the laptop’s display, optionally switching between the onboard and discrete graphics.
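for the switching part, this is roughly what render offload looks like in practice. a sketch assuming the proprietary NVIDIA driver’s PRIME render-offload variables (with Mesa drivers you’d set `DRI_PRIME=1` instead):

```python
import os
import subprocess

# Run one program on the discrete GPU while the desktop keeps rendering
# on the integrated one. These env vars are the proprietary NVIDIA
# driver's PRIME render-offload switches.
env = dict(os.environ,
           __NV_PRIME_RENDER_OFFLOAD="1",
           __GLX_VENDOR_LIBRARY_NAME="nvidia")
# The renderer string in the output should name the NVIDIA card.
subprocess.run(["glxinfo", "-B"], env=env, check=False)
```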

@Boomkop3@reddthat.com (creator)

The M40 doesn’t have outputs :p
But that adapter looks nice! Thank you!

If you’ve got a thunderbolt port on your laptop and a thunderbolt dock, then there’s no reason why it shouldn’t work.

I’m not familiar with thunderbolt on linux, but on windows you plug it in and it just works™️ and shows up as if it were inside your machine. Your DE on linux might authorize it automatically, but if you’re command line only you’ll probably have to approve the device first (typically with `boltctl`).
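For what it’s worth, here’s a minimal sketch of what that approval step looks like under the hood, assuming the kernel’s Thunderbolt sysfs interface (the same interface that `boltctl` wraps):

```python
from pathlib import Path

# List Thunderbolt devices and show whether each one is authorized.
# Writing "1" to the `authorized` attribute (as root) approves a device;
# `boltctl authorize` does the same thing with a nicer interface.
base = Path("/sys/bus/thunderbolt/devices")
if base.exists():
    for dev in sorted(base.iterdir()):
        auth = dev / "authorized"
        if not auth.exists():
            continue
        name = dev / "device_name"
        label = name.read_text().strip() if name.exists() else dev.name
        print(f"{label}: authorized={auth.read_text().strip()}")
```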

@Boomkop3@reddthat.com (creator)

I did some more searching, and found that NVMe-to-PCIe adapters are affordable. That’s going to look a bit janky, but fortunately I don’t care.

Thank you again for the suggestion!
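One thing worth checking with those adapters is what link you actually end up with, since an M.2 slot typically carries four PCIe lanes. A small sketch reading the kernel’s sysfs attributes (the GPU will show up as one of these devices):

```python
from pathlib import Path

# Print the negotiated PCIe link speed and width for every device,
# e.g. "0000:05:00.0 8.0 GT/s PCIe x4" for a card on an M.2 riser.
for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    speed = dev / "current_link_speed"
    width = dev / "current_link_width"
    if speed.exists() and width.exists():
        print(dev.name, speed.read_text().strip(),
              "x" + width.read_text().strip())
```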

@Boomkop3@reddthat.com (creator)

worth a shot then, worst case I return the dock thing

hendrik

I did some quick googling. Are those thunderbolt docks really $350? That’s like half the price of a cheap computer?!

@Boomkop3@reddthat.com (creator)

That can’t be
googling …
holy damn you’re right

hendrik

Maybe you should do the maths on other options. You could get a refurbished PC for $350. Or buy the dock anyways. Or spend the money on cloud compute if you’re just occasionally using AI. Idk.

@Boomkop3@reddthat.com (creator)

I did not say occasionally. We use AI a lot. Currently it’s mostly for image indexing, recognition, object detection, and audio transcription. But we’re planning to expand to more, and we’d like to use more accurate models.
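Transcription is a good example of where the extra VRAM pays off. A hedged sketch using openai-whisper (the model size, the input file, and whether your PyTorch build still supports the M40’s Maxwell architecture are all assumptions to verify):

```python
import whisper  # pip install openai-whisper; needs a CUDA-capable PyTorch

# The bigger checkpoints are noticeably more accurate but hungrier:
# "large" wants on the order of 10 GB of VRAM, comfortable on a 24 GB
# M40 but out of reach for most laptop GPUs.
model = whisper.load_model("large", device="cuda")
result = model.transcribe("meeting.mp3")  # hypothetical input file
print(result["text"])
```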

hendrik

Fair enough. Yes, I figured you probably wouldn’t have an M40 lying around by accident 😅

@Boomkop3@reddthat.com (creator)

Oh it kinda just got here somehow. I don’t know why, all I did was buy the thing

poVoq

There are external GPU cases that might work with your laptop, but at least on older models these were relatively bandwidth-limited. That doesn’t matter much for gaming, but I guess it might cause more problems with AI workloads? On the other hand, maybe not, if the model fits completely into the VRAM of the M40?
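Rough back-of-envelope for the VRAM question (all assumed numbers): if the weights fit entirely on the card, the PCIe link mostly matters while loading the model, not during inference.

```python
# Weights-only estimate; activations / KV cache add overhead on top.
params_billion = 7      # e.g. a 7B-parameter model
bytes_per_param = 2     # FP16
vram_gb = 24            # Tesla M40, 24 GB variant

weights_gb = params_billion * bytes_per_param  # 7e9 params * 2 B ~= 14 GB
print(f"weights ~ {weights_gb} GB; fits in {vram_gb} GB: {weights_gb < vram_gb}")
```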

Possibly linux

For the price of the enclosure you can get another old computer. Pick up an old workstation and put the GPU in it. Be mindful of power requirements.
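For a ballpark power budget (assumed figures, not a spec sheet; also note these Tesla cards often take an EPS/CPU-style 8-pin connector rather than a PCIe one, so check before buying adapters):

```python
# Rough PSU sizing; swap in the numbers for your actual parts.
gpu_w = 250      # Tesla M40 rated board power
cpu_w = 95       # typical older workstation CPU
rest_w = 75      # drives, fans, motherboard, margin
headroom = 1.3   # keep the PSU well under its rating

print("suggested PSU:", round((gpu_w + cpu_w + rest_w) * headroom), "W")
# -> ~546 W, so a decent 550-650 W unit
```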
