I mean, they aren't wrong. From an efficiency standpoint, current AI is like using a 350 hp car engine to turn a child's rock tumbler or spin-art thingy. Sure, it produces some interesting outputs, but at the cost of way too much energy for what is being done. That's the current situation with using generalized compute, or even high-end GPUs, for AI.
Best I can tell, the "way forward" is further development of ASICs that are specific to the model being run. This should increase efficiency, decrease the ecological impact (less electricity usage), and free up silicon and components, possibly decreasing prices and increasing the availability of things like consumer graphics cards again (but I won't hold my breath for that part).
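To put rough numbers on why a model-specific chip helps, here's a back-of-envelope sketch. Every figure in it is an illustrative assumption (power draw, utilization, throughput), not a measurement of any real chip; the point is just that a fixed-function part matched to one model can plausibly win on both power and utilization at once:

```python
# Back-of-envelope energy comparison: general-purpose GPU vs. a
# hypothetical model-specific ASIC. All numbers below are rough,
# illustrative assumptions, not measurements of real hardware.

GPU_POWER_W = 700          # assumed: ballpark TDP of a datacenter GPU
GPU_UTILIZATION = 0.35     # assumed: transformer inference is often
                           # memory-bound, wasting much of the peak compute

ASIC_POWER_W = 75          # assumed: fixed-function chip sized to one model
ASIC_UTILIZATION = 0.85    # assumed: dataflow laid out to match the model

TOKENS_PER_S_AT_FULL_UTIL = 3000  # assumed throughput at 100% utilization

def joules_per_token(power_w: float, utilization: float) -> float:
    """Energy per generated token, given average power draw and the
    effective throughput after the utilization penalty."""
    tokens_per_s = TOKENS_PER_S_AT_FULL_UTIL * utilization
    return power_w / tokens_per_s

gpu_j = joules_per_token(GPU_POWER_W, GPU_UTILIZATION)
asic_j = joules_per_token(ASIC_POWER_W, ASIC_UTILIZATION)

print(f"GPU : {gpu_j:.3f} J/token")
print(f"ASIC: {asic_j:.3f} J/token")
print(f"ASIC advantage: ~{gpu_j / asic_j:.0f}x less energy per token")
```

Under those made-up numbers the ASIC comes out roughly 20x cheaper per token in energy. The real ratio depends entirely on the model and the hardware, but the direction of the argument holds: specialization buys you lower power and higher utilization at the same time.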