this post was submitted on 11 Feb 2024
50 points (98.1% liked)


cross-posted from: https://lemmy.world/post/11840660

TAA is a crucial tool for developers - but is the impact to image quality too great?

For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today's games, but is it a blessing, a curse, or both? Whichever way you slice it, it's here to stay, so what is it, why do so many games use it and what's with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren't they used any more?

top 24 comments
[–] [email protected] 11 points 6 months ago

TAA has become so common because it's "free". Temporal data is required by DLSS and FSR, so if you are implementing those technologies you already have the necessary data to implement TAA, making it a no-brainer to include.
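
To make that concrete, here is a rough sketch of the core TAA resolve step, assuming per-pixel motion vectors and a persistent history buffer are already available (the same inputs DLSS and FSR consume). The function name and the simple 3x3 neighborhood clamp are illustrative choices, not any particular engine's implementation:

```python
import numpy as np

def resolve_taa(current, history, motion, alpha=0.1):
    """Blend the current frame with a reprojected history buffer.

    current: (H, W, 3) float array, this frame's jittered render
    history: (H, W, 3) float array, last frame's resolved output
    motion:  (H, W, 2) float array, per-pixel motion vectors in pixels
    alpha:   weight of the new frame (smaller alpha = more accumulation, more blur)
    """
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Reproject: look up where each pixel was last frame.
    prev_x = np.clip((xs - motion[..., 0]).round().astype(int), 0, w - 1)
    prev_y = np.clip((ys - motion[..., 1]).round().astype(int), 0, h - 1)
    reprojected = history[prev_y, prev_x]

    # Clamp history to the current 3x3 neighborhood to limit ghosting.
    lo = current.copy()
    hi = current.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            shifted = np.roll(current, (dy, dx), axis=(0, 1))
            lo = np.minimum(lo, shifted)
            hi = np.maximum(hi, shifted)
    reprojected = np.clip(reprojected, lo, hi)

    # Exponential blend; the result becomes next frame's history.
    return alpha * current + (1.0 - alpha) * reprojected
```

The small blend weight is what accumulates samples over time, and it is also where the characteristic softness comes from.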

[–] [email protected] 5 points 6 months ago (3 children)

Antialiasing is a byproduct of moving away from CRT display technology. The natural image softening in CRT tech is not replicated in LCD and LED displays.

TAA is one of the better options, but at the end of the day it is difficult to create a true AA solution that doesn't have artifacts without resorting to supersampling.
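
For reference, supersampling is conceptually just rendering at a multiple of the output resolution and averaging the extra samples down. A minimal sketch, assuming a plain 2x2 box filter (real implementations typically use rotated or sparse sample grids):

```python
import numpy as np

def downsample_ssaa(image, factor=2):
    """Average factor x factor blocks of a supersampled render.

    image: (H * factor, W * factor, 3) float array rendered at higher resolution
    returns: (H, W, 3) anti-aliased image at the target resolution
    """
    h, w, c = image.shape
    assert h % factor == 0 and w % factor == 0, "render size must be a multiple of factor"
    return image.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))
```

The catch is that a 2x factor means rendering four times as many pixels, which is why pure supersampling is rarely affordable at modern resolutions.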

[–] [email protected] 6 points 6 months ago (1 children)

We used AA on our CRTs back in the day. Of course, we were all running at resolutions like 1024x768, so it was needed a lot more. The higher your resolution, the less you need it.

[–] [email protected] 2 points 6 months ago

Yes, that's true. AA was helpful at what I call "medium resolutions", the range between 480 and 768 vertical pixels. But CRTs still had a softer image simply as a byproduct of how the technology worked, and they worked better at lower resolutions like 240p (AFAIK, any signal with less than 480 lines of vertical resolution was automatically progressive scan).

Game developers of the time exploited this, famously using dithering for transparency effects on platforms that didn't fully support them, such as the SEGA Saturn (it supported transparency on 2D sprites, but not on textured polygons like the PSX did). The softer image smoothed the dithering out, giving the appearance of a bigger available color palette and extra special effects. Flickering sprites every other field was also a common technique, thanks to CRTs' high image persistence. This is why games like Streets of Rage look awful on modern displays, but display correctly on CRTs.
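
For illustration, the dithered-transparency trick is essentially a checkerboard stipple: draw the "transparent" layer on alternating pixels and let the CRT's softness blend it with whatever is behind it. A toy sketch of the idea on plain full-frame arrays (not actual Saturn hardware behavior; the function name is made up):

```python
import numpy as np

def checkerboard_transparency(background, overlay):
    """Fake 50% transparency by drawing the overlay on alternating pixels.

    On a sharp LCD this reads as a visible mesh; on a CRT the softer image
    blends it into something close to true alpha blending.
    """
    h, w, _ = background.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = ((xs + ys) % 2 == 0)[..., None]  # checkerboard pattern
    return np.where(mask, overlay, background)
```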

But regardless, AA will probably be phased out eventually; it's just a tool to mitigate the growing pains of new display technology.

[–] [email protected] 2 points 6 months ago (1 children)

Interesting take. Do you think that natural image softening would come back in newer technologies?

[–] [email protected] 4 points 6 months ago (1 children)

I'm not that guy, but I don't think so. The trend will likely be that we get to the point where we render and display at such a high resolution that you can't see pixels anymore. We're getting there already with smaller 4K displays, where turning on AA doesn't make an appreciable difference over native 4K rendering.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

I agree with this. Outside of some media that may release with special effects designed to mimic the softer image of a CRT, I think display technology will just progress to the point where nothing uses AA at all because the resolution is too high to tell the difference. It's already like that with 4K TVs: you sit far enough away that you usually can't tell the difference between 4K and 1080p.

[–] [email protected] 1 points 6 months ago

DLAA comes to mind

[–] [email protected] 5 points 6 months ago (5 children)

The first things I always turn off are motion blur, anti-aliasing and ray tracing.

Motion blur just makes it look like you're drunk, anti-aliasing makes everything look like it's smeared with Vaseline, and ray tracing tanks your FPS for not much added quality.

[–] [email protected] 10 points 6 months ago

I don't think I could stomach a game without AA. It's on par with playing a game at an unstable 30fps frame rate; it's just nauseating.

[–] [email protected] 4 points 6 months ago (1 children)

Try playing Forza without AA. Ray tracing tanks your performance, but it gives great visual enhancements; once you experience it, there's no going back.

[–] [email protected] -4 points 6 months ago (1 children)

I don't really play racing games or Forza, so maybe it's unique to Forza or racing in general, but every RPG, action, adventure, strategy, survival, shooter and sim game I have played looks worse with AA, and ray tracing is not worth cutting your FPS in half for.

[–] [email protected] 3 points 6 months ago (1 children)

You must not notice aliasing and shimmering, then? Most people find it very distracting when everything flickers, shimmers and stair-steps with the slightest motion.

And ray tracing really depends on the game, implementation, and hardware. Ray-traced global illumination alone fixes the classic video game look that stems from rasterized lighting errors (light leaking, default ambient light, etc.). It is the future for high-quality games, even non-photorealistic ones. Its expense is offset by both reconstruction and improved hardware. You won't be able to avoid it forever, even if you want to.

[–] [email protected] 1 points 6 months ago (1 children)

It has gotten much better in the last 7 years. I will say that I usually test 1.5× or 2× my resolution if possible, which can be less taxing depending on the engine, as I'm always trying to eke out a little extra from my 970.

[–] [email protected] 1 points 6 months ago (1 children)

2x on a 970? I struggled with my 970 at 1440p low-medium settings until I got the 3080, and I often had to drop scaling to 1080p. And that was on "last gen" titles; I can't imagine still trying to limp that thing along nowadays, as much as I loved it.

[–] [email protected] 1 points 6 months ago

Depends on the game, but I don't usually pick up current gen for a bit. Unless you count Switch Emulation?

[–] [email protected] 3 points 6 months ago

Motion blur just makes it look like you're drunk

Someone hasn't tried motion blur since GTA in 2004

[–] [email protected] 2 points 6 months ago (1 children)

Same. I also disable stuff like film grain and lens flares whenever possible.

[–] [email protected] 2 points 6 months ago

I always have film grain enabled. It provides some half-decent dithering that helps mask color banding, which is especially noticeable on my low-end monitor.
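
That masking effect is essentially dithering: adding a little noise before the image is quantized to 8 bits per channel turns smooth gradients into fine grain instead of visible bands. A minimal sketch of the idea (the function and its parameters are made up for illustration):

```python
import numpy as np

def grain_dither(image, strength=1.0, rng=None):
    """Add noise before 8-bit quantization to hide color banding.

    image:    (H, W, 3) float array in [0, 1]
    strength: noise amplitude in output code values (~1 LSB by default)
    """
    rng = rng or np.random.default_rng()
    noise = (rng.random(image.shape) - 0.5) * (strength / 255.0)
    return np.clip(np.round((image + noise) * 255.0), 0, 255).astype(np.uint8)
```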

[–] [email protected] 0 points 6 months ago (1 children)

It's like you've used each thing once in some specific game where it was badly implemented and decided that's how it looks in all games.

There is no objective "it looks like this"; every game does things slightly or very differently. I'm certain you either are unusually blind to detail, have serious vision problems, or are just very good at convincing yourself of your own bad ideas.

[–] [email protected] 1 points 6 months ago

There are actually a few Unreal Engine games where you can't disable AA in the settings. I have tried playing with it on, but I just end up disabling it in the .ini files anyway because it looks bad. I have not encountered an AA implementation that does not make the game look blurry.
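
For what it's worth, in many Unreal Engine 4 titles this is done by adding a console-variable override to the game's Engine.ini; whether the variable is honored varies by game and engine version, so treat it as a commonly used pattern rather than a guarantee:

```ini
[SystemSettings]
; 0 turns off post-process AA (TAA/FXAA) in many UE4 titles
r.PostProcessAAQuality=0
```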

I have never met anyone who doesn't disable motion blur outright, so I didn't think anyone would ever defend it.

[–] [email protected] 3 points 6 months ago* (last edited 6 months ago)

At one point

I was there, 3000 years ago, when the first consumer AA was almost usable on my Voodoo 1.

[–] [email protected] 3 points 6 months ago

TAA just makes beautiful graphics look crappy and blurry.

I'd rather not have AA at all.

[–] [email protected] 0 points 6 months ago

All temporal effects kinda blow.