this post was submitted on 03 Dec 2024
242 points (97.6% liked)
Technology
Yeah, 40 is just not for me. I'd rather go 1080p and hopefully get 75+ FPS. It's really hard to go back from that to something as choppy as 40; even 60 feels kinda bad now.
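For anyone wondering why the drop feels so drastic: frame rate is the inverse of frame time, so the jump from 40 to 75 FPS cuts each frame's duration nearly in half. A quick sketch (the FPS values are just the ones from this thread):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (40, 60, 75):
    print(f"{fps} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

So 40 FPS means 25.0 ms per frame versus 13.3 ms at 75 FPS; going 40 → 60 saves 8.3 ms per frame, while 60 → 75 only saves another 3.4 ms, which is roughly why the first step down feels the worst.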
And yes, I use local LLMs too, and 8 GB of VRAM is kinda painful and limiting, though the biggest hurdle is still ROCm and Python, which are an absolute mess. I'd love to get even more than 16 GB, but that's usually reserved for the high-end segment and gets real pricey real quick.
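To put numbers on why 8 GB is limiting: a rough rule of thumb is weights ≈ parameter count × bytes per weight, plus some headroom for the KV cache and activations. A back-of-envelope sketch (the 1.5 GB overhead figure is an assumption, not a measured value):

```python
def vram_needed_gb(params_billions: float, bits_per_weight: float,
                   overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate for running an LLM: quantized weights
    plus an assumed fixed overhead for KV cache and activations."""
    weights_gb = params_billions * bits_per_weight / 8  # GB, since 1B params * 1 byte ≈ 1 GB
    return weights_gb + overhead_gb

# A 7B model at 4-bit quantization vs. a 13B model:
print(vram_needed_gb(7, 4))   # comfortably inside 8 GB
print(vram_needed_gb(13, 4))  # right at the 8 GB ceiling
```

By this estimate a 4-bit 7B model needs about 5 GB, while a 4-bit 13B model already hits 8 GB, so anything bigger (or longer contexts) is where the extra VRAM would pay off.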
Linux and the fact that I play a lot of indie titles are also why I'd still avoid Intel, even if they had something in the upper midrange. Still, I would've loved to see some competition in that area, because then AMD would have to deliver on prices too, and that'd be good for me.
Yeah, 40 isn't great, but I play a lot of Switch games, and 40 is generally a good framerate for those. But I definitely notice it when switching between new AAA and indie/older games.
Intel could've earned my business by making up for mediocre performance with a ton of VRAM so I could tinker w/ LLMs between games. But no, I guess I'll stick w/ my current card until I can't even get 40 FPS reliably.