this post was submitted on 26 Oct 2023
100 points (93.9% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
Not familiar with the tech, but wouldn't server-side LLMs still have an advantage regardless, given the greater power available on tap? Anything that improves local LLMs will also benefit server-side LLMs, won't it?
Not necessarily: as models get faster, the network latency between your local and remote machines becomes a bigger fraction of the total response time. If your local machine processes a request in 50ms and the remote machine in 5ms, a round-trip latency of just 45ms already makes the two equal, and anything above that makes your machine faster.
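Reading the example as a 50 ms local model versus a much faster 5 ms remote model, the break-even arithmetic can be sketched like this (the function and numbers are purely illustrative):

```python
# Illustrative break-even calculation: the effective response time of a
# remote LLM is its processing time plus the network round-trip latency,
# while a local model pays no network cost.

def total_time_ms(processing_ms: float, latency_ms: float = 0.0) -> float:
    """Wall-clock time to get a response, in milliseconds."""
    return processing_ms + latency_ms

local = total_time_ms(50)        # local inference, no network hop
remote = total_time_ms(5, 45)    # faster remote model + 45 ms round trip

# At 45 ms of latency the two tie; any extra latency favours the local model.
print(local, remote, local <= remote)
```

The point is that the remote model's raw speed advantage (5 ms vs 50 ms here) is bounded: once latency exceeds the processing-time gap, the local model wins regardless of how powerful the server is.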
Running locally also cuts out a lot of potential security issues inherent to sending data over a network, and not sending your data to a third party is a bonus too.
Possibly, but given the choice between paying $20/month for a marginally better version of something that's free and probably built into your editor at that point, most people would probably take the free thing. At that point paid LLMs will need to find new niches beyond simply existing.