this post was submitted on 30 Jan 2024
71 points (94.9% liked)
ChatGPT
you are viewing a single comment's thread
view the rest of the comments
If you have a GPU in your PC, it's almost always faster to just run your own LLM locally, and you won't have this issue. Search for Ollama.
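To give a rough idea of what "run it locally" looks like in practice: here's a minimal sketch that queries a local Ollama server through its REST API, assuming you've already installed Ollama and pulled a model (e.g. `ollama pull llama2`). The endpoint, port, and payload fields follow Ollama's documented `/api/generate` interface; the model name is just an example.

```python
# Minimal sketch: ask a locally running Ollama server a question.
# Assumes Ollama is installed, running, and a model has been pulled,
# e.g. `ollama pull llama2`. Uses only the Python standard library.
import json
import urllib.request

def ask(prompt: str, model: str = "llama2") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local port
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("Why is the sky blue?"))
```

Everything stays on your machine, so there's no rate limiting or downtime from a third-party service; the tradeoff is that speed and model size depend on your GPU's VRAM.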