Hexbears, do you prefer the Series X or PS5?
(hexbear.net)
No, some of them have the 780M, so keep an eye out for those. The 680M is kinda good, but I'd skip it for being old.
The Strix Point may be what you're looking for. However, the game you mentioned here - Alan Wake 2 - struggles even on RTX 3000 and 4000 series GPUs. Just wanted to let you know that this game is terribly optimized.
Yeah it's also a dogshit game, this much is true of Callisto Protocol as well, but I use their numbers as a metric for future proofing, kinda. If a GPU can't run this now, how bad is it gonna be in a year?
It's kind of a contradiction: everyone is yelling about how you need 16GB of VRAM and a fast GPU, and you do need that to run stuff... but almost none of the big games are worth it
pointless
You need to give some love to games like Hollow Knight, Noita and Triangle Strategy. The triple-A game scene will be getting worse soon.
This is what happens when investors are your real customers. I sure as hell hope TES6 gets cancelled, because it'll suck anyway, now that we've seen how their space game was absolute garbage.
Anyways, if you remember the shared-memory tech used in the Xbox and older PlayStations, which ran on AMD processors, those had RAM shared between the CPU and GPU. That was primitive tech, limited to at most 2 or 4GB. I've heard AMD is bringing that to PCs, desktop and laptop alike, so the performance gains may be really phenomenal. This one is called Strix Halo, aka the Medusa Point, and right now it's being tested for cache and RAM sharing (I think, I may be wrong tho) - the interconnect is called Infinity Fabric or something.
But you'll have to wait till 2025.
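Since a unified-memory APU leans on system RAM instead of dedicated VRAM, memory bandwidth is the number to watch. A quick back-of-envelope sketch: peak bandwidth is just transfer rate times bus width. The 256-bit LPDDR5X-8000 configuration below is a rumored Strix Halo-style spec, not a confirmed one; the DDR5-5600 figure is a typical desktop setup.

```python
# Back-of-envelope peak memory bandwidth: transfers/s * bus width in bytes.
def peak_bandwidth_gbs(mtps, bus_bits):
    """mtps: megatransfers per second; bus_bits: memory bus width in bits."""
    return mtps * 1e6 * (bus_bits / 8) / 1e9

# Typical dual-channel DDR5-5600 desktop: 128-bit combined bus.
dual_channel_ddr5 = peak_bandwidth_gbs(5600, 128)

# Rumored wide-bus APU setup: LPDDR5X-8000 on a 256-bit bus (assumption).
wide_lpddr5x = peak_bandwidth_gbs(8000, 256)

print(f"dual-channel DDR5-5600: {dual_channel_ddr5:.1f} GB/s")  # 89.6 GB/s
print(f"256-bit LPDDR5X-8000:   {wide_lpddr5x:.1f} GB/s")       # 256.0 GB/s
```

That's roughly a 3x jump over ordinary dual-channel RAM, which is why a fat shared-memory APU could plausibly hang with low/mid-range discrete GPUs.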
Metroidvania is cringe, but I'm literally playing Celeste right now, having just finished Tactics Ogre, and then I'm gonna play Fata Morgana. I agree, but if that's your bag then you don't need to consider any of this; a GTX 1050 Ti will suffice for that. The point of stressing about big games running is future-proofing. The last AAA game I bought was Doom Eternal, so I'm not that fussed; I was only curious about APUs as a console alternative.
Yeah Bethesda's last good game was before the 2008 recession, I would laugh if TES6 got canned. Skyrim was fuuuckin shiiit!
I would be hesitant to get hyped for any memory-sharing stuff AMD talks about. Back when they were still doing modules with shared elements in the Bulldozer and Piledriver architectures, their APU line featured a bit of this functionality via Heterogeneous System Architecture (HSA for short), around the same time as the Mantle API. AMD talked a lot of shit about how cool it was for the processor and graphics parts of the APU to access the same memory, and how cool it was that they would be treated "equally" computationally. They dropped this idea hardcore when Ryzen launched, because it was basically a GPU-shaped crutch for their garbage CPUs, and the marketing talk was all cope, lol