I mean, there are advantages to using AV1 for photos... hardware-accelerated decoding being one.
Decoding a large AVIF image grid should in theory work on a GPU and finish faster, with less power, than any software-based image format implementation (rough decode sketch below).
AV1 is also just an awesome format that's entirely free to use out of the gate.
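For a concrete picture of that decode path, here's a minimal sketch using libavif's C API ("photo.avif" is a placeholder file name, and error handling is trimmed; whether the AV1 decode underneath runs on dedicated hardware or in software like dav1d depends on which codec backend libavif was built with):

```c
#include <stdio.h>
#include "avif/avif.h"

int main(void) {
    avifDecoder * decoder = avifDecoderCreate();
    if (avifDecoderSetIOFile(decoder, "photo.avif") != AVIF_RESULT_OK ||
        avifDecoderParse(decoder) != AVIF_RESULT_OK) {
        fprintf(stderr, "failed to open/parse AVIF\n");
        avifDecoderDestroy(decoder);
        return 1;
    }
    /* An AVIF can hold a single image, a grid, or a sequence; iterate all items. */
    while (avifDecoderNextImage(decoder) == AVIF_RESULT_OK) {
        avifRGBImage rgb;
        avifRGBImageSetDefaults(&rgb, decoder->image); /* match size/bit depth */
        avifRGBImageAllocatePixels(&rgb);
        avifImageYUVToRGB(decoder->image, &rgb);       /* convert decoded YUV to RGB */
        printf("decoded %ux%u at %u-bit\n", rgb.width, rgb.height, rgb.depth);
        avifRGBImageFreePixels(&rgb);
    }
    avifDecoderDestroy(decoder);
    return 0;
}
```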
Well yes, but without hardware acceleration JPEG XL is many times faster, which matters if all you have is a CPU, for example.
It's also highly parallelizable compared to AVIF, which matters a lot as core counts keep growing with ARM and hybrid-architecture CPUs (see the sketch at the end of this comment).
AVIF also fares badly with high-fidelity and lossless encoding, has a third of the bit depth, and has pretty small dimension limits for something like photography.
I don't think AVIF is a bad format per se. I just think that if I want to replace a photo-oriented format, I'd like to do it with one that's focused on "good" photos, not one where photos are an afterthought with their own upsides and downsides.
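To make that parallelism point concrete, here's a rough sketch of a multithreaded JPEG XL decode with libjxl (the decode_jxl wrapper and the in-memory data/size input are assumptions for illustration; error handling is trimmed):

```c
#include <stdint.h>
#include <stdlib.h>
#include "jxl/decode.h"
#include "jxl/thread_parallel_runner.h"

int decode_jxl(const uint8_t * data, size_t size) {
    JxlDecoder * dec = JxlDecoderCreate(NULL);
    /* Hand the decoder a thread pool; decoding then scales across however
     * many worker threads the machine can spare. */
    void * runner = JxlThreadParallelRunnerCreate(
        NULL, JxlThreadParallelRunnerDefaultNumWorkerThreads());
    JxlDecoderSetParallelRunner(dec, JxlThreadParallelRunner, runner);

    JxlDecoderSubscribeEvents(dec, JXL_DEC_BASIC_INFO | JXL_DEC_FULL_IMAGE);
    JxlDecoderSetInput(dec, data, size);

    JxlBasicInfo info;
    JxlPixelFormat format = {4, JXL_TYPE_UINT8, JXL_NATIVE_ENDIAN, 0}; /* RGBA8 */
    uint8_t * pixels = NULL;

    for (;;) {
        JxlDecoderStatus status = JxlDecoderProcessInput(dec);
        if (status == JXL_DEC_BASIC_INFO) {
            JxlDecoderGetBasicInfo(dec, &info); /* image dimensions, bit depth */
        } else if (status == JXL_DEC_NEED_IMAGE_OUT_BUFFER) {
            size_t buffer_size;
            JxlDecoderImageOutBufferSize(dec, &format, &buffer_size);
            pixels = malloc(buffer_size);
            JxlDecoderSetImageOutBuffer(dec, &format, pixels, buffer_size);
        } else if (status == JXL_DEC_FULL_IMAGE || status == JXL_DEC_SUCCESS) {
            break; /* pixels now holds the decoded RGBA output */
        } else if (status == JXL_DEC_ERROR) {
            break;
        }
    }

    free(pixels);
    JxlThreadParallelRunnerDestroy(runner);
    JxlDecoderDestroy(dec);
    return 0;
}
```

The thread pool is passed in as a callback, so the same code scales from a dual-core phone to a many-core server without any dedicated decoding hardware.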
I thought even mobile-tier integrated GPUs could decode AV1 extremely quickly.
Well yes, sure, but remember that AV1 decoding only became standard one or two GPU generations ago, and encoding only this generation. iPhones only got support with the 15 Pro, so it will be another generation before it trickles down to the base models. And what about the hundreds of millions of Android phones in Asia and elsewhere with dirt-cheap SoCs? They most likely won't have dedicated AV1 decoding hardware for a long time.
So that's a TON of hardware that would become slow and inefficient if everything switched to AVIF tomorrow. I'm not saying AVIF decoding will be a big hurdle in the future, but how long until all the hardware browsing the web today has been replaced? That's why I think something that's efficient and fast on CPUs, without any specialised hardware, is better suited as a replacement.
Servers often come without a GPU, and they're usually the ones encoding the images in the first place.
I don’t think we should worry about servers meant for image transcoding not having the proper hardware for image transcoding. The problem with the GPU requirement starts and ends with consumer devices imo
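For reference, the encode path doesn't assume a GPU anywhere either. A CPU-only AVIF encode with libavif looks roughly like this (the encode_avif wrapper, the dimensions, and the maxThreads value are placeholder assumptions; error handling is trimmed):

```c
#include <stdint.h>
#include "avif/avif.h"

int encode_avif(uint32_t width, uint32_t height) {
    avifImage * image = avifImageCreate(width, height, 8, AVIF_PIXEL_FORMAT_YUV420);
    avifRGBImage rgb;
    avifRGBImageSetDefaults(&rgb, image);
    avifRGBImageAllocatePixels(&rgb);
    /* ... fill rgb.pixels with the source photo here ... */
    avifImageRGBToYUV(image, &rgb); /* convert to the YUV the encoder expects */

    avifEncoder * encoder = avifEncoderCreate();
    encoder->maxThreads = 16; /* scale with the server's core count */
    avifRWData output = AVIF_DATA_EMPTY;
    avifResult result = avifEncoderWrite(encoder, image, &output);
    /* on success, output.data / output.size hold the finished .avif */

    avifRWDataFree(&output);
    avifEncoderDestroy(encoder);
    avifRGBImageFreePixels(&rgb);
    avifImageDestroy(image);
    return (result == AVIF_RESULT_OK) ? 0 : 1;
}
```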
Isn't AV1 exclusively for video encoding? I haven't heard of it being used for photos.
https://en.wikipedia.org/wiki/AVIF