this post was submitted on 26 Jun 2023
Gaming
I bought a 4090 just to run LLMs and Stable Diffusion, with some occasional gaming. But if you're just using it for ML, get whatever is cheaper (ironically, I found the 4090 cheaper than the 3090 when shopping around).
The 7900 XTX recently got support for Stable Diffusion and LLMs. On paper it's faster than the RTX 4090 for FP16 computation, and it did seem faster judging by my experience with a rented RTX 4090 on RunPod versus my own 7900 XTX: 14 seconds (RTX 4090) vs. 6 seconds (7900 XTX).
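For what it's worth, per-image numbers like the 14 s vs. 6 s above are easy to skew with one-time startup costs, so a small timing harness with a warmup pass helps. This is just a generic sketch, not the commenter's actual setup; the `generate` callable is a hypothetical stand-in for a real Stable Diffusion pipeline call (e.g. from the `diffusers` library).

```python
import time

def benchmark(generate, runs=3, warmup=1):
    """Average the wall-clock time of a generation callable.

    `generate` is a hypothetical stand-in for something like:
        lambda: pipe("an astronaut riding a horse", num_inference_steps=30)
    Warmup runs are excluded so model loading / kernel compilation
    doesn't inflate the measured per-image time.
    """
    for _ in range(warmup):
        generate()
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        generate()
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

# Dummy workload standing in for a real pipeline call:
avg = benchmark(lambda: time.sleep(0.01))
print(f"avg generation time: {avg:.3f}s")
```

Running the same harness on both cards (same prompt, same step count, same resolution) makes the comparison apples-to-apples.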
The 7900 XTX is an option if you want something about $1,000 cheaper than the RTX 4090 with similarly sized VRAM (both have 24 GB) and comparable performance.
I'm doing summer research with a focus on ML. I just built my computer and picked AMD because of the price, but I didn't know that Nvidia was the one to pick at the moment if that's what I wanted it for. I don't know enough about hardware and could use the school labs anyway, but I should have done better research (ironic, heh).