this post was submitted on 18 Aug 2023
275 points (100.0% liked)

Gaming


I am probably unqualified to speak about this, as I'm using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I'm someone who enjoys making fun of weird-looking 3D people.

But I feel games' graphics have reached the point of diminishing returns. Today's AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn't need anything more powerful than a 1080 Ti for years. I think game studios should slow down their graphical improvements, as they are - in my opinion - unnecessary and just prevent people with lower-end systems from enjoying games. And who knows, maybe we would start seeing cheap 50-watt gaming GPUs that are viable and capable of running games at medium/high settings - even iGPUs render good graphics now.

TL;DR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?
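To put a rough number on the power argument, here is a back-of-the-envelope sketch in Python; the wattages, hours played, and electricity price are all figures I picked for illustration, not measurements:

```python
# Rough comparison (assumed numbers, not measurements): annual energy use of a
# ~350 W high-end GPU vs a hypothetical 50 W GPU, gaming 2 hours a day.

HOURS_PER_DAY = 2
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.30  # assumed electricity price per kWh

def annual_energy(gpu_watts: float) -> tuple[float, float]:
    """Return (kWh per year, cost per year) for a given GPU power draw."""
    kwh = gpu_watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    return kwh, kwh * PRICE_PER_KWH

for label, watts in [("high-end GPU (~350 W)", 350), ("hypothetical 50 W GPU", 50)]:
    kwh, cost = annual_energy(watts)
    print(f"{label}: {kwh:.0f} kWh/year, ~{cost:.0f} per year")
# high-end GPU (~350 W): 256 kWh/year, ~77 per year
# hypothetical 50 W GPU: 37 kWh/year, ~11 per year
```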

Note: it would be insane of me to claim that there is no big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right one (5 years old)?

Note 2: this is not much more than a discussion starter and is unlikely to evolve into something larger.

[–] [email protected] 26 points 1 year ago (2 children)

I understand the sentiment, but it seems like you're drawing arbitrary lines in the sand for what the "correct" amount of power for gaming is. Why waste 50 watts of GPU (or more like 150 total system watts) on a game that something like a Steam Deck will run almost identically at 15 watts? Ten times less power for definitely not ten times less fidelity. We could go all the way back to the original Game Boy at 0.7 watts; the fidelity drops, but so does the power. What is the "correct" wattage?
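To make those ratios concrete, a quick sketch using the wattages above (the 150 W total-system figure and the 15 W and 0.7 W numbers are the same rough estimates from the paragraph, not measurements):

```python
# Ratio check for the wattages quoted above (rough estimates, not measurements).
desktop_system_w = 150   # assumed total system draw for a ~50 W GPU build
steam_deck_w = 15        # rough Steam Deck draw while gaming
game_boy_w = 0.7         # original Game Boy

print(f"Desktop vs Steam Deck: {desktop_system_w / steam_deck_w:.0f}x the power")
print(f"Steam Deck vs Game Boy: {steam_deck_w / game_boy_w:.0f}x the power")
# Desktop vs Steam Deck: 10x the power
# Steam Deck vs Game Boy: 21x the power
```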

I agree that the top-end GPUs are shit at efficiency and we could cut back. But I don't agree that fidelity and realism should stop advancing. Some type of efficiency requirement would be nice, but every year games should get more advanced and every year GPUs should get better (and hopefully stay efficient).

[–] [email protected] 2 points 1 year ago

I agree that I shouldn't have set that arbitrary 50-watt figure; I just looked at my GPU and bigger ones and came up with that number.

[–] [email protected] 2 points 1 year ago (1 children)

I agree that the top-end GPUs are shit at efficiency and we could cut back.

According to the Steam hardware survey, the 4090, 3090, 6900 XT, and 7900 XTX combined are being used by about 1.7% of gamers.

This number is, of course, inflated (at least slightly) because people who have money to buy these cards are also more likely to buy games and people owning older/cheaper cards are more likely to be playing pirated copies.

The top-tier cards are a showcase of technological advancement; they are not really used by a large number of people, so there's not much point in cutting them back. It would only lower the baseline for the next generation, leading to less advancement.

[–] [email protected] 2 points 1 year ago

That's a very good point, but a little misleading. A better number would come from adding up all the top-tier cards from every generation, not just the past two. Just because they're old doesn't mean they aren't still relatively inefficient for their generation.

If we kept the generations exactly the same but got rid of the top one or two cards, technological advancement would happen just as fast. Because really, the top-tier cards are about the silicon lottery and putting in as much power as possible while keeping clocks stable. They aren't different from an architecture perspective within the same generation; it's about being able to sell the best silicon and more VRAM at a premium.

But as you said, it's still a drop in the bucket compared to the overall market.