
I have experience running servers, but I'd like to know whether this is feasible. I just need a private LLM comparable to GPT-3.5 running.

[–] [email protected] 5 points 1 month ago (1 children)

Look into ollama. It shouldn't be an issue if you stick to 7B-parameter models.
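
If you want to poke at it from a script, here's a minimal sketch against ollama's local REST API (it listens on port 11434 by default). The model name `mistral` is just a placeholder for whichever 7B model you've pulled with `ollama pull`:

```python
# Minimal sketch: query a locally running ollama server over its REST API.
# Assumes ollama is installed and serving, and that a 7B model has been
# pulled, e.g. `ollama pull mistral`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # ollama's default local endpoint
    json={
        "model": "mistral",                 # placeholder; any pulled model works
        "prompt": "Explain reverse proxies in one paragraph.",
        "stream": False,                    # one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])              # the generated text
```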

[–] [email protected] -5 points 1 month ago (2 children)

Yeah, I'd seen something along those lines and was quite interested. What about quantized models?

[–] [email protected] 3 points 1 month ago (1 children)

A quantized model with more parameters is generally better than a floating-point model with fewer parameters. If you can squeeze a 14B-parameter model down to 4-bit integer quantization, it'll still generally outperform a 16-bit floating-point 7B-parameter equivalent.
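
To put rough numbers on that: weight memory is roughly parameters × bits per weight, so a quick back-of-the-envelope sketch (ignoring KV cache, activations, and runtime overhead, which push real usage higher) looks like this:

```python
# Back-of-the-envelope estimate of weight memory: parameters x bytes per weight.
# Ignores KV cache and runtime overhead, so actual usage will be somewhat higher.
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9  # decimal gigabytes

print(weight_memory_gb(7, 16))   # 7B at fp16  -> ~14 GB
print(weight_memory_gb(14, 4))   # 14B at int4 -> ~7 GB
```

So the 14B int4 model fits in about half the memory of the 7B fp16 one while usually answering better.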

[–] [email protected] 2 points 1 month ago (1 children)

I don't have any experience with them, honestly, so I can't help you there.

[–] [email protected] -5 points 1 month ago

Appreciate you 👍👍