this post was submitted on 03 Oct 2023
215 points (95.4% liked)

Linux

if you could pick a standard format for a purpose what would it be and why?

e.g. flac for lossless audio because...

(yes you can add new categories)

summary:

  1. photos .jxl
  2. open domain image data .exr
  3. videos .av1
  4. lossless audio .flac
  5. lossy audio .opus
  6. subtitles srt/ass
  7. fonts .otf
  8. container mkv (doesn't contain .jxl)
  9. plain text utf-8 (many also say markup but disagree on the implementation)
  10. documents .odt
  11. archive files (this one is causing a bloodbath so i picked randomly) .tar.zst
  12. configuration files toml
  13. typesetting typst
  14. interchange format .ora
  15. models .gltf / .glb
  16. daw session files .dawproject
  17. otdr measurement results .xml
[–] [email protected] 89 points 1 year ago* (last edited 1 year ago) (8 children)

This is the kind of thing I think about all the time, so I have a few.

  • Archive files: .tar.zst
    • Produces better compression ratios than the DEFLATE compression algorithm (used by .zip and gzip/.gz) and does so faster.
    • By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well."
    • .tar.xz is also very good and seems more popular (probably since it was released 6 years earlier, in 2009), but, when tuned to its maximum compression level, .tar.zst can achieve a compression ratio pretty close to LZMA (used by .tar.xz and .7z) and do it faster^1.

      zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.

  • Image files: JPEG XL/.jxl
    • "Why JPEG XL"
    • Free and open format.
    • Can handle lossy images, lossless images, images with transparency, images with layers, and animated images, giving it the potential of being a universal image format.
    • Much better quality and compression efficiency than current lossy and lossless image formats (.jpeg, .png, .gif).
    • Produces much smaller files for lossless images than AVIF^2.
    • Supports much larger resolutions than AVIF's 9-megapixel limit (important for lossless images).
    • Supports up to 24-bit color depth, much more than AVIF's 12-bit color depth limit (which, to be fair, is probably good enough).
  • Videos (Codec): AV1
    • Free and open format.
    • Much more efficient than x264 (used by .mp4) and VP9^3.
  • Documents: OpenDocument / ODF / .odt

    It's already a NATO standard for documents. The Microsoft Word formats (.doc, .docx) are unusable outside the Microsoft Office ecosystem; I feel outraged every time I need to edit a .docx file because it breaks the layout so easily. And some older .doc files don't even open correctly in current Microsoft Word.
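As a quick sketch of the archive/compress/encrypt separation described above (file and directory names here are made up for illustration):

```shell
# Create some throwaway data to archive (hypothetical example paths).
mkdir -p project && echo "hello" > project/notes.txt

# Archive + compress in one step; GNU tar invokes zstd itself.
tar --zstd -cf project.tar.zst project

# Encryption would be one more independent step, e.g.:
#   gpg --symmetric project.tar.zst   # -> project.tar.zst.gpg
# (left commented out here because gpg prompts for a passphrase)

# Listing the contents only needs tar again:
tar --zstd -tf project.tar.zst
```

Each tool in the chain can be swapped independently, which is the whole point of the separation.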

[–] [email protected] 14 points 1 year ago* (last edited 1 year ago) (1 children)

.tar is pretty bad as it lacks an index, making it impossible to quickly seek around in the file. The compression on top adds another layer of complication. It might still work great as a tape archiver, but for sending files around the Internet it is quite horrible. It's really just getting dragged around for cargo-cult reasons, not because it's good at the job it is doing.

In general I find the archive situation a little annoying, as archives are largely completely unnecessary, that's what we have directories for. But directories don't exist as far as HTML is concerned and only single files can be downloaded easily. So everything has to get packed and unpacked again, for absolutely no reason. It's a job computers should handle transparently in the background, not an explicit user action.

Many file managers try to add support for .zip and allow you to go into them like it is a folder, but that abstraction is always quite leaky and never as smooth as it should be.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

.tar is pretty bad as it lacks an index, making it impossible to quickly seek around in the file.

.tar.pixz/.tpxz has an index, uses LZMA, and permits parallel compression/decompression (increasingly important on modern processors).

https://github.com/vasi/pixz

It's packaged in Debian, and I assume other Linux distros.

Only downside is that GNU tar doesn't have a single-letter shortcut to use pixz as a compressor, the way it does "z" for gzip, "j" for bzip2, or "J" for xz (LZMA); gotta use the more-verbose "-Ipixz".
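To illustrate (directory names made up): -I just hands the archive stream to whatever compressor you name on $PATH, so pixz slots in exactly where gzip does. The pixz line is left commented in case it isn't installed:

```shell
mkdir -p somedir && echo data > somedir/f.txt

# With pixz installed, the indexed, parallel variant would be:
#   tar -Ipixz -cf archive.tpxz somedir

# The same -I mechanism, demonstrated here with gzip:
tar -Igzip -cf archive.tar.gz somedir
tar -Igzip -tf archive.tar.gz
```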

Also, while I don't recommend it: IIRC gzip has a limited range over which the effects of compression can propagate, and even though the format wasn't designed for it, there is software that leverages this to hack in random access anyway. I don't recall whether anyone has rigged it up with tar and indexing, but I suppose if someone were specifically determined to use gzip, they could go that route.

[–] [email protected] 10 points 1 year ago (1 children)
  • By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well.".

wait so does it do all of those things?

[–] [email protected] 23 points 1 year ago

So there's a tool called tar that creates an archive (a .tar file). Then there's a tool called zstd that can be used to compress files, including .tar files, which then becomes a .tar.zst file. And then you can encrypt your .tar.zst file using a tool called gpg, which would leave you with an encrypted, compressed .tar.zst.gpg archive.

Now, most people aren't doing everything in the terminal, so the process for most people would be pretty much the same as creating a ZIP archive.
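Spelled out one tool at a time, the pipeline above looks something like this (file names are just examples):

```shell
echo "some data" > notes.txt

# 1. Archive with tar (no compression yet).
tar -cf notes.tar notes.txt

# 2. Compress the archive with zstd -> notes.tar.zst (input kept by default).
zstd -q notes.tar

# 3. (optional) Encrypt with gpg; commented out since it prompts for a passphrase:
#   gpg --symmetric notes.tar.zst   # -> notes.tar.zst.gpg

ls -l notes.tar notes.tar.zst
```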

[–] [email protected] 9 points 1 year ago (1 children)

By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of “Make each program do one thing well.”.

The problem here being that GnuPG does nothing really well.

Videos (Codec): AV1

  • Much more efficient than x264 (used by .mp4) and VP9[3].

AV1 is also much younger than H.264 (AV1 is a specification; x264 is an implementation), and only recently have software encoders become somewhat viable. A more apt comparison would be AV1 to HEVC, though the latter is also somewhat old by now, if still a competitive codec. Unfortunately there currently aren't many options to use AV1 in a very meaningful way: you can encode your own media with it, but that's about it. You can stream to YouTube, but YouTube will re-encode it to another codec.

[–] [email protected] 6 points 1 year ago (1 children)

The problem here being that GnuPG does nothing really well.

Could you elaborate? I've never had any issues with gpg before, and I'm curious what issues people are having.

Unfortunately currently there aren’t many options to use AV1 in a very meaningful way; you can encode your own media with it, but that’s about it; you can stream to YouTube, but YouTube will recode to another codec.

AV1 has almost full browser support (iirc), and companies like YouTube, Netflix, and Meta have started moving over to AV1 from VP9 (since AV1 is the successor to VP9). But you're right, it's still gaining adoption; this is more my dream world than a prediction of future standardization.

[–] [email protected] 4 points 1 year ago (2 children)

Could you elaborate? I’ve never had any issues with gpg before and curious what people are having issues with.

This article and the blog post linked within it summarize it very well.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Encrypting Email

Don't. Email is insecure. Even with PGP, it's default-plaintext, which means that even if you do everything right, some totally reasonable person you mail, doing totally reasonable things, will invariably CC the quoted plaintext of your encrypted message to someone else.

Okay, provide me with an open standard that is widely-used that provides similar functionality.

It isn't there. There are parties who would like to move email users into their own little proprietary walled gardens, but not a replacement for email.

The guy is literally saying that encrypting email is unacceptable because it hasn't been built from the ground up to support encryption.

I mean, the PGP guys added PGP to an existing system because otherwise nobody would use their nifty new system. Hell, it's hard enough to get people to use PGP as it is. Saying "well, if everyone in the world just adopted a similar-but-new system that is more-amenable to encryption, that would be helpful", sure, but people aren't going to do that.

[–] [email protected] 2 points 1 year ago

The message to be taken from here is rather "don't bother": if you need secure communication, use something else. If you're just using it so that Google can't read your mail, it might be OK, but don't expect this solution to be secure or anything. It's security theater, for the reasons listed.

But the threat model for some people is a powerful adversary who can spend millions on software to find something against you in your communication, and who controls at least a significant portion of the infrastructure your data travels through. Think of whistleblowers in oppressive regimes: it's absolutely crucial there that no information at all leaks. There's just no way to safely rely on mail + PGP for secure communication in that situation.

And if you're fine with your secrets leaking at one point or another, you didn't really need that felt security in the first place. But then again, you're just doing what the blog calls LARPing in the first place.

[–] [email protected] 5 points 1 year ago

.odt is simply a better standard than .docx.

No surprise, since OOXML is barely even a standard.

[–] [email protected] 3 points 1 year ago (1 children)
[–] [email protected] 18 points 1 year ago

AV1 can do lossy video as well as lossless video.

[–] [email protected] 3 points 1 year ago

I get a better compression ratio with xz than with zstd, both at their highest levels, when building an Ubuntu squashfs.

Zstd is way faster, though.

[–] [email protected] 2 points 1 year ago (1 children)

Wait, I'm confused. What's the difference between .tar.zst and .tar.xz?

[–] [email protected] 9 points 1 year ago (1 children)

Different ways of compressing the initial .tar archive: .zst uses Zstandard, while .xz uses LZMA.
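Concretely, it's the same tar archive run through a different compressor (names below are made up); plain tar autodetects either format when reading:

```shell
mkdir -p src && echo same-content > src/a.txt

tar --zstd -cf src.tar.zst src   # Zstandard-compressed tar
tar -cJf src.tar.xz src          # xz (LZMA)-compressed tar; -J selects xz

# Reading back autodetects the compressor in both cases:
tar -tf src.tar.zst
tar -tf src.tar.xz
```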

[–] [email protected] 2 points 1 year ago

Damn didn't realize that JXL was such a big deal. That whole JPEG recompression actually seems pretty damn cool as well. There was some noise about GNOME starting to make use of JXL in their ecosystem too...