this post was submitted on 07 Oct 2024
357 points (96.1% liked)
Technology
you are viewing a single comment's thread
Crazy how quickly NVIDIA went up. I wonder if they'll crash down just as fast should the AI hype either die off or shift to other manufacturers (Intel, AMD, etc.) or in-house solutions (e.g. Apple Intelligence).
I just want to get a graphics card for less than the rest of a rig combined... shit's ridiculous, and AMD doesn't even seem to be trying to compete anymore.
They do compete, it's just that users weigh DLSS and ray tracing far more than they should, and undervalue VRAM for the long term.
For example, a 7900 GRE costs about the same as a 4070, but more people will buy the 4070 regardless.
I definitely do like ray tracing, sadly. I'm more interested in graphics and immersion in a game's setting/story than competitiveness or ultra-high FPS. Water reflections and mirrors just look absolutely gorgeous to me.
I'm strongly considering AMD for my next build regardless, as I'd like to switch fully to Linux at some point.
Eh, I got an AMD GPU somewhat recently and it meets all my expectations. I'm not too interested in RTX or compute, and they offer really good value on raster performance.
I'm coping for RDNA4.
They said the same when the crypto hype came along. If AI dies off, there will be other trends in computing that require cutting-edge silicon. AI may or may not keep surging, but hardware will be needed no matter what. NVIDIA is selling shovels, not panning for gold.
Apple is not there yet; its models were trained on Google hardware. Though I'm surprised it wasn't Nvidia hardware.
What's "Google hardware"? Likely just NVIDIA hardware running in Google's cloud?
No no, Google actually does have its own custom proprietary AI hardware, the TPU: https://en.wikipedia.org/wiki/Tensor_processing_unit
Ah, TIL.