this post was submitted on 18 Aug 2023
275 points (100.0% liked)

Gaming


I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I'm someone who has fun making fun of weird-looking 3D people.

But I feel games' graphics have reached the point of diminishing returns. Today's AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn't need anything more powerful than a 1080 Ti for years. I think game studios should just slow down their graphical improvements, as they are unnecessary - in my opinion - and only prevent people with lower-end systems from enjoying games. And who knows, maybe we will start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TL;DR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is no big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right one (five years old)?

Note 2: this is not much more than a discussion starter, and it's unlikely to evolve into something larger.

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (2 children)

I kind of stopped caring about graphics at the end of the PS3 days. That was the last time where graphics made a substantial gameplay difference; namely, they allowed bigger environments and more verticality (e.g. Assassin's Creed). In the 10 years after that it just turned into a wash: the shaders got better, we got PBR, subsurface scattering and all that jazz, but none of it really mattered. Something like Dead Space or Red Faction: Guerrilla still has more interesting interaction with enemies and environments than most modern games.

I care about graphics only insofar as you need enough of them to render the interactive elements of your game. Past that it's just fluff, or even detrimental, as uber-realistic graphics often lead to a much harder-to-read game environment - it's no longer clear what you can interact with and what you can't (and ginormous floating hologram icons ain't exactly a good solution here either).

The one area that still needs powerful graphics is virtual reality. The screen covers a much bigger FOV than a monitor, so there are a ton more pixels to push around, and it needs a higher refresh rate too (~4K@90fps is where it starts getting acceptable). However, VR still hasn't gained any real traction in the gaming space, and ~~Facebook~~ Meta trying to monopolize it ain't helping either. This might take quite a few more years before it becomes relevant.
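To put rough numbers on the "ton more pixels" part, here's a back-of-the-envelope sketch (the resolutions and refresh rates are just illustrative round figures, and real VR rendering adds per-eye overhead and supersampling on top):

```python
# Back-of-the-envelope pixel throughput: 1080p@60 monitor vs ~4K@90 VR.
# The numbers are illustrative, not measurements.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixels that must be shaded per second at a given resolution/refresh."""
    return width * height * fps

monitor = pixels_per_second(1920, 1080, 60)  # ~124 million px/s
vr = pixels_per_second(3840, 2160, 90)       # ~746 million px/s

print(f"monitor: {monitor / 1e6:.0f} Mpx/s")
print(f"VR:      {vr / 1e6:.0f} Mpx/s (~{vr / monitor:.0f}x the monitor)")
```

So even before any VR-specific costs, that's roughly a 6x jump in raw pixel throughput.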

These days my graphics card mostly runs Stable Diffusion for AI image generation, while my gaming is mostly just 16-bit games or indie stuff that could run on a 10- or 20-year-old PC just fine.

[–] [email protected] 6 points 1 year ago

Thank you! The strength of video games as a medium is interactivity. If it doesn't enhance my ability to exist and act in a virtual space, it's essentially worthless to me. I want better physics, better AI, and more interesting mechanics, not better anti-aliasing and effects that mistake eyeballs for camera lenses.

[–] [email protected] 7 points 1 year ago

Some great games make use of that increased power. Many more do not.

> maybe we will start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap

The Steam Deck runs lots of games at those settings at about 15 W, more or less, depending on how you count - and at lower resolutions, too. Strides are being made toward better performance per watt, especially in ARM architectures - a step Apple arguably took too soon, as far as gaming is concerned.

[–] [email protected] 7 points 1 year ago

It depends on the type of game for me. For (competitive) multiplayer games, I don't really care much. As long as it looks coherent and artistically well done, I'm fine with it.

For single-player games, especially story-based games, I like when there's a lot of emphasis on graphical fidelity. Take The Last of Us: Part II, for example. To me, especially in the (real-time) cutscenes, the seemingly great synergy between artists and engine developers really pays off. You can see how a character feels; emotions and tensions come across so well. Keep in mind they managed to do this on aging PS4 hardware. They didn't do anything revolutionary per se with this game in terms of graphical fidelity (no fancy RT or whatever), but they combined everything so well.

Now, would I have enjoyed the game if it looked significantly worse? Probably yes, but I have to say that the look of the game likely made me enjoy it more. Low-resolution textures, poor shadows, or unrealistic facial expressions would've taken away from the immersion for me. Some would say the gameplay of TLoU:II was rather bland because it didn't add or change a lot over TLoU (1), but for me, bringing the story across in this very precise way was what made it a great game (people will have other opinions, but that's fine).

I agree with you on the power consumption part, though. Having GPUs consume north of 400 watts while playing a game is insane. I have a 3080, and it's what, 340 watts TDP? In reality it consumes around 320 watts under full load, but that's a lot of power for playing games. This generation's GPUs are a lot more efficient, at least in theory (on the Nvidia side, a 4070 uses 100-150 watts less to achieve the same output as a 3080), which is good.
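To put that in perspective, a quick back-of-the-envelope comparison (the frame rate is made up, and the wattages are just the ballpark figures above, not measurements):

```python
# Rough performance-per-watt comparison, assuming (per the claim above) a
# 4070 matches a 3080's output while drawing ~120 W less. The frame rate
# is made up; the wattages are ballpark figures, not measurements.

assumed_fps = 100                      # same output assumed for both cards
watts = {"3080": 320, "4070": 320 - 120}

for card, draw in watts.items():
    print(f"{card}: {assumed_fps / draw:.2f} fps/W at {draw} W")

# 3080: 0.31 fps/W, 4070: 0.50 fps/W -> roughly 60% better efficiency
```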

But there's a trend in recent generations where manufacturers ship their GPUs (and CPUs) set way beyond their best power/performance ratio, just to "win" over the competition in benchmarks/reviews. Sure, you can tweak it, but in my opinion it should be the other way around: make the hardware efficient by default and give people headroom to overclock with crazy power budgets if they choose to.
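For what it's worth, the tweak itself is simple on Nvidia cards; here's a small sketch of doing it from Python by shelling out to nvidia-smi (its -pl flag sets the board power limit in watts; it needs admin rights, and the value must stay within the range the card supports):

```python
# Sketch: cap an NVIDIA GPU's power limit by shelling out to nvidia-smi.
# "-i" selects the GPU, "-pl" sets the limit in watts; this needs root,
# and the value must be within the board's supported min/max range.
import subprocess

def set_gpu_power_limit(watts: int, gpu_index: int = 0) -> None:
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_gpu_power_limit(250)  # e.g. run a ~320 W card at 250 W for efficiency
```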

I remember when AMD released the FX-9590 back in 2013 and they got absolutely laughed at because it had a TDP of 220 watts (I know, TDP != actual power consumption, but it was around 220). Nowadays, a 13900K consumes 220 watts out of the box no problem and then some, and people are like "bUt LoOk At ThE cInEbEnCh ScOrE!!111!111". Again, you can tweak it, but out of the box it sucks power like nobody's business.

This needs to improve again. Gaming GPUs should cap out at 250 watts at the high-end, and CPUs at like 95 watts, with mainstream GPUs targeting around 150 watts and CPUs probably around 65 watts.

[–] [email protected] 6 points 1 year ago (3 children)

Ray Tracing makes a huge difference regardless of everything else.

Games are still fun at low quality (I mostly play on Steam Deck), but we won't have reached diminishing returns until every game is fully ray traced for all lighting, including special effects.

[–] [email protected] 6 points 1 year ago (1 children)

So I'm an environmental criminal just because I enjoy graphics and have an expensive hobby?

[–] [email protected] 6 points 1 year ago

> You shouldn't need anything more powerful than a 1080 Ti for years

This seems true to me. A 1080 Ti runs everything today and is likely going to be okay for 3 years or more. Though the 1080 Ti is a very powerful card...

I think a 1060 3GB should be where the line is drawn; the absolutely trash-looking textures that some games have on minimum graphics shouldn't use more than 2 GB of VRAM (yes, I'm looking at you, BG3).

[–] [email protected] 5 points 1 year ago (1 children)

RX550+768p gang, rise up!🙌

[–] [email protected] 5 points 1 year ago (4 children)

I'm with you. I've wanted more players, larger maps, and less emphasis on graphics (because they've been pretty good) for a while now. I think back on games that were fun and addictive: they weren't pretty, they were blocky. Yet it didn't matter.

Contemporary graphics are amazing, I admit. But I also find it difficult to put a release date on any game from the last 5-8 years; they all look pretty similar to my eye.

Am I wrong for being willing to sacrifice some graphical sophistication for the sake of, say, a 128v128 game in a destructible environment?

[–] [email protected] 5 points 1 year ago

As for people with lower-end systems being unable to enjoy newer titles, I'd like to point you to the prolific and endlessly creative indie scene. Sure, you may not be able to play the latest AAA "Skyrim Ultra Remaster DX: Penultimate Boogaloo - GotY Edition," but there are lots of gems out there that don't require the latest GPUs.

But as for realistic graphics, there's actually an aspect you may not be aware of: game companies influence the movie industry. Because games have to deliver not just realism but realism in real time, they have to constantly invent new techniques, and the movie industry picks these up and makes better movies with them.

Plus, IMO, improvements are always welcome. Death Stranding had some of the best mocap I've seen in either the movie or video game industry, and I don't feel those improvements were wasted effort.

[–] [email protected] 5 points 1 year ago (1 children)

I buy games based on the following tier scale:

Gameplay > Performance > Price > Expected time playing > Graphics

I agree with your point in the post, especially after playing Darktide, which chucked performance out the window for fog and lighting effects. It doesn't matter how pretty your game is if it's rendering at 3fps.

[–] [email protected] 4 points 1 year ago (1 children)

Development has always been incremental, but as game engines get better, they start being used for more and more things. AAA studios don't always develop the graphics and game engine from scratch; most often they develop one technology and use it for many games, or even buy a pre-built engine and "just" build the game around it.

Unreal Engine has been used not just for games but also for real-time (or near-real-time) filmmaking. The same engine used to play a game can be used to create the background effects in TV shows, or even whole scenes.

It's crazy to suggest we just stop working on something because it's good enough, because that's not what people do.

[–] [email protected] 5 points 1 year ago (1 children)

> It's crazy to suggest we just stop working on something because it's good enough, because that's not what people do.

Came here to say this, glad it's already been posted.

Also, why is it that every time someone is being critical of advancements in "realistic graphics" they always post screenshots of Lara Croft?

[–] [email protected] 4 points 1 year ago (1 children)

Lara Croft has been around since the triangle-boobs days. There aren't many other characters that have been in 3D as long as she has (Mario 64 was released the same year, but Mario hasn't come as far as Lara Croft in terms of photorealism). Plus, she's instantly recognizable. Personally, I don't think there's any deeper reason than that.

[–] [email protected] 4 points 1 year ago (1 children)

My stance on graphics is that today's realistic graphics, compared to a game like Fallout: New Vegas, don't look noticeably better to me unless I actively pay attention to them (which I don't do whenever I play a game).

Then again, I usually don't play triple AAA studio games anymore, so my perception is skewed by either older games or ones with a lot more stylized graphics.

[–] [email protected] 7 points 1 year ago

> triple AAA

Ah yes, AAAAAAAAA
