I have experience running servers, but I'd like to know whether this is feasible. I just need a private LLM running, something comparable to GPT-3.5.

@TheBigBrother@lemmy.world (creator)

Yeah, I did see something related to what you mentioned, and I was quite interested. What about quantized models?

@MasterNerd@lemm.ee

Honestly, I don't have any experience with them, so I can't help you there.

@TheBigBrother@lemmy.world (creator)

Appreciate you 👍👍

A quantized model with more parameters is generally better than a floating-point model with fewer parameters. If you can squeeze a 14B-parameter model down to 4-bit integer quantization, it'll still generally outperform the equivalent 7B-parameter model in 16-bit floating point.
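A back-of-the-envelope sketch of why this works out: weight memory scales with parameters × bits per weight, so the 4-bit 14B model actually needs roughly half the VRAM of the 16-bit 7B model while keeping twice the parameters. This ignores KV cache and runtime overhead, so treat the numbers as rough lower bounds.

```python
def weight_memory_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed for the model weights alone, in GiB."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

# 14B parameters at 4-bit int vs. 7B parameters at 16-bit float:
print(f"{weight_memory_gib(14, 4):.1f} GiB")   # 14B @ 4-bit  -> ~6.5 GiB
print(f"{weight_memory_gib(7, 16):.1f} GiB")   # 7B  @ 16-bit -> ~13.0 GiB
```

So for a fixed VRAM budget, quantizing buys you parameter count, which is usually the better trade for output quality.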

@TheBigBrother@lemmy.world (creator)

Interesting information, mate. I'm reading up on the subject; thanks for the help 👍👍
