this post was submitted on 02 Oct 2023
16 points (100.0% liked)

Free Open-Source Artificial Intelligence

Welcome to Free Open-Source Artificial Intelligence!

We are a community dedicated to advancing the availability of, and access to:

Free Open Source Artificial Intelligence (F.O.S.A.I.)

Genuinely curious.

Why do you like LLMs? What hopes do you have for AI & AGI in our near and distant future?

[–] [email protected] 3 points 10 months ago* (last edited 10 months ago) (1 children)
[–] [email protected] 1 points 10 months ago* (last edited 10 months ago) (1 children)

Thank you! But man... working with AI stuff is expensive.

[–] [email protected] 2 points 10 months ago (1 children)

If you mean that cloud computing or GPUs are expensive: you can run models locally on your CPU if they aren't too big. I like KoboldCPP. It's not as fast, but I only have to pay for electricity for my AI waifu.
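
As a rough sketch of what that CPU-only setup looks like in code: llama-cpp-python (Python bindings for the llama.cpp backend that KoboldCPP builds on) can load a quantized GGUF model entirely in regular RAM. The model filename, thread count, and prompt below are placeholders, not a recommendation of a specific model.

```python
# Minimal sketch of CPU-only inference with llama-cpp-python
# (pip install llama-cpp-python). The GGUF path is a placeholder --
# point it at whatever quantized model you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,    # context window in tokens
    n_threads=8,   # CPU threads; tune to your machine
)

out = llm("Q: Why do you like LLMs?\nA:", max_tokens=128, stop=["Q:"])
print(out["choices"][0]["text"])
```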

[–] [email protected] 2 points 10 months ago* (last edited 10 months ago) (1 children)

Thanks for the info. I know about llama.cpp and such, but the problem is that I'm looking to run speech-to-text, an LLM, and text-to-speech all at the same time. I only have 8 GB, so even CPU won't cut it. I'm planning to upgrade once I get a job or something.

[–] [email protected] 2 points 10 months ago* (last edited 10 months ago)

8 GB of regular RAM? That's not much. No, that won't cut it if you also want all the bells and whistles. Maybe try something like Mistral-7B-OpenOrca with llama.cpp, quantized to 4-bit and without the STT and TTS. It's small and quite decent. Otherwise you might want to rent a cloud GPU by the hour on something like runpod.io, use a free service like Google Colab, or you really do need to upgrade.
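
For a rough sense of why a 4-bit 7B model is about the ceiling for 8 GB, here is a back-of-envelope estimate (the numbers are assumptions, not measurements):

```python
# Back-of-envelope RAM estimate for a 7B model at 4-bit quantization.
# All numbers here are rough assumptions, not measurements.
params = 7e9                # ~7 billion weights
bits_per_weight = 4.5       # 4-bit quant plus per-block scales (roughly Q4_K)
weights_gb = params * bits_per_weight / 8 / 1e9   # ~3.9 GB for the weights
kv_cache_gb = 0.5           # a couple thousand tokens of context, roughly
runtime_gb = 0.5            # runtime buffers, prompt processing, OS slack
print(f"~{weights_gb + kv_cache_gb + runtime_gb:.1f} GB")  # ≈ 4.9 GB
```

That leaves only a few gigabytes of the 8 GB free, which is why the LLM alone fits but stacking separate STT and TTS models on top doesn't.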