this post was submitted on 03 Oct 2023
215 points (95.4% liked)

Linux


if you could pick a standard format for a purpose what would it be and why?

e.g. flac for lossless audio because...

(yes you can add new categories)

summary:

  1. photos .jxl
  2. open domain image data .exr
  3. videos .av1
  4. lossless audio .flac
  5. lossy audio .opus
  6. subtitles srt/ass
  7. fonts .otf
  8. container mkv (doesn't contain .jxl)
  9. plain text utf-8 (many also say markup but disagree on the implementation)
  10. documents .odt
  11. archive files (this one is causing a bloodbath so i picked randomly) .tar.zst
  12. configuration files toml
  13. typesetting typst
  14. interchange format .ora
  15. models .gltf / .glb
  16. daw session files .dawproject
  17. otdr measurement results .xml
[–] [email protected] 132 points 1 year ago (3 children)

Just going to leave this xkcd comic here.

Yes, you already know what it is.

[–] [email protected] 26 points 1 year ago (1 children)

One could say it is the standard comic for these kinds of discussions.

[–] [email protected] 112 points 1 year ago (4 children)

Open Document Standard (.odt) for all documents. In all public institutions (it's already a NATO standard for documents).

Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit a .docx file because it breaks the layout so easily. And some older .doc files don't even open correctly in Microsoft Word anymore.

Actually, IMHO, there should be a better alternative to .odt as well. Something more declarative/scripted, like LaTeX, but still WYSIWYG. LaTeX (and XeTeX, for my use cases) is too messy for me to work with, especially when a package is Byzantine. And it can be non-reproducible if I share/reuse the same document somewhere else.

Something has to be made with document files.

[–] [email protected] 21 points 1 year ago (1 children)

Markdown, asciidoc, restructuredtext are kinda like simple alternatives to LaTeX

[–] [email protected] 17 points 1 year ago (2 children)

It is unbelievable that we do not have a standard document format.

[–] [email protected] 14 points 1 year ago

What's messed up is that, technically, we do. Originally, OpenDocument was the ISO standard document format. But then, baffling everyone, Microsoft got ISO to accept .docx as a standard as well. So now we have two competing document standards, the second of which is simply worse.

[–] [email protected] 15 points 1 year ago (5 children)

I was too young to use it in any serious context, but I kinda dig how WordPerfect handles formatting codes. They are hidden by default, but you can reveal and manipulate them as needed.

It might already be a thing, but I imagine a LaTeX-based standard for document formatting would do well with a WYSIWYG editor that hides the complexity by default but exposes it to those who need to manipulate it.

[–] [email protected] 89 points 1 year ago* (last edited 1 year ago) (5 children)

zip or 7z for compressed archives. I hate that for some reason rar has become the de facto standard for piracy. It's just so bad.

The other day I saw a tar.gz containing a multipart-rar which contained an iso which contained a compressed bin file with an exe to decompress it. Soooo unnecessary.

Edit: And the decompressed game of course has all of its compressed assets in renamed zip files.

[–] [email protected] 51 points 1 year ago (1 children)

A .tarducken, if you will.

[–] [email protected] 35 points 1 year ago (1 children)

It was originally rar because it’s so easy to separate into multiple files. Now you can do that in other formats, but the legacy has stuck.

[–] [email protected] 18 points 1 year ago (8 children)

.tar.zstd all the way IMO. I've almost entirely switched to archiving with zstd, it's a fantastic format.
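
For anyone curious what this looks like in practice, here's a minimal sketch in Python that builds a .tar.zst, assuming the third-party zstandard package is installed; the paths and compression level are placeholders.

```python
# Sketch: stream a tar archive through a zstd compressor to produce a .tar.zst.
# Archiving (tarfile) and compression (zstandard) stay separate steps,
# in keeping with the "one tool per job" argument made elsewhere in the thread.
import tarfile
import zstandard as zstd  # third-party package: pip install zstandard

def make_tar_zst(src_dir: str, out_path: str, level: int = 19) -> None:
    cctx = zstd.ZstdCompressor(level=level)
    with open(out_path, "wb") as raw, cctx.stream_writer(raw) as compressed:
        # mode="w|" writes the tar stream sequentially into the compressor.
        with tarfile.open(fileobj=compressed, mode="w|") as tar:
            tar.add(src_dir)

make_tar_zst("my_project", "my_project.tar.zst")  # placeholder paths
```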

[–] [email protected] 89 points 1 year ago* (last edited 1 year ago) (38 children)

This is the kind of thing i think about all the time so i have a few.

  • Archive files: .tar.zst
    • Produces better compression ratios than the DEFLATE compression algorithm (used by .zip and gzip/.gz) and does so faster.
    • By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well."
    • .tar.xz is also very good and seems more popular (probably since it was released 6 years earlier, in 2009), but, when tuned to its maximum compression level, .tar.zst can achieve a compression ratio pretty close to LZMA (used by .tar.xz and .7z) and do it faster^1.

      zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.

  • Image files: JPEG XL/.jxl
    • "Why JPEG XL"
    • Free and open format.
    • Can handle lossy images, lossless images, images with transparency, images with layers, and animated images, giving it the potential of being a universal image format.
    • Much better quality and compression efficiency than current lossy and lossless image formats (.jpeg, .png, .gif).
    • Produces much smaller files for lossless images than AVIF^2
    • Supports much larger resolutions than AVIF's 9-megapixel limit (important for lossless images).
    • Supports up to 24-bit color depth, much more than AVIF's 12-bit color depth limit (which, to be fair, is probably good enough).
  • Videos (Codec): AV1
    • Free and open format.
    • Much more efficient than H.264 (the codec typically found in .mp4 files) and VP9^3.
  • Documents: OpenDocument / ODF / .odt

    it’s already a NATO standard for documents Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit .docx file because it breaks the layout easily. And some older .doc files cannot even work with Microsoft Word.

[–] [email protected] 48 points 1 year ago (35 children)

Ogg Opus for all lossy audio compression (mp3 needs to die)

7z or tar.zst for general purpose compression (zip and rar need to die)

[–] [email protected] 23 points 1 year ago (5 children)

The existence of zip, and especially rar files, actually hurts me. It's slow, it's insecure, and the compression is from the Jurassic era. We can do better.

[–] [email protected] 46 points 1 year ago (4 children)

I don't know what to pick, but something other than PDF for the task of transferring documents between multiple systems. And yes, I know PDF has its strengths and there's a reason why it's so widely used, but that doesn't mean I have to like it.

Additionally, all proprietary formats, especially ones that have gained enough users that they're treated like a standard or a requirement if you want to work with X.

[–] [email protected] 16 points 1 year ago

oh it's x, not x... i hate our timeline

[–] [email protected] 46 points 1 year ago (8 children)
[–] [email protected] 23 points 1 year ago* (last edited 1 year ago)

I agree.

I especially love that it addresses the biggest pitfall of the typical "fancy new format does things better than the one we're already using" transition, in that it's specifically engineered to make migration easier, by allowing a lossless conversion from the dominant format.

[–] [email protected] 46 points 1 year ago (11 children)

Literally any file format except PDF for documents that need to be edited. Fuck Adobe and fuck Acrobat

[–] [email protected] 18 points 1 year ago (10 children)

Isn't the point of PDF that it can't (or, perhaps more accurately, shouldn't) be edited after the fact? It's supposed to be immutable.

[–] [email protected] 44 points 1 year ago (1 children)

Resume information. There have been several attempts, but none have become an accepted standard.

When I was a consultant, this was the one standard I longed for the most. A data file where I could put all of my information, and then filter and format it for each application. But ultimately, I wanted to be able to submit the information in a standardised format - without having to re-enter it endlessly into crappy web forms.

I think things have gotten better today, but at the cost of reliance on a monopoly (LinkedIn). And I'm no longer in that sort of job market. But I think that desire was so strong it'll last me until I'm in my grave.
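
Just to illustrate the idea (this is a made-up, hypothetical structure, not any existing résumé schema), a single structured data file that gets filtered and rendered per application could look something like this in Python:

```python
# Hypothetical sketch: one structured résumé file, filtered per job application.
# The field names and tags here are invented for illustration only.
import json

resume = {
    "name": "A. Candidate",
    "experience": [
        {"role": "Systems administrator", "tags": ["linux", "ops"], "years": 3},
        {"role": "Web developer", "tags": ["frontend"], "years": 2},
    ],
}

def render_for(tag: str) -> str:
    """Keep only the entries relevant to this application and format them."""
    relevant = [e for e in resume["experience"] if tag in e["tags"]]
    lines = [resume["name"]] + [f"- {e['role']} ({e['years']} yrs)" for e in relevant]
    return "\n".join(lines)

print(render_for("linux"))           # tailored plain-text output for one posting
print(json.dumps(resume, indent=2))  # or submit the raw data instead of a web form
```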

[–] [email protected] 39 points 1 year ago (5 children)

SQLite for all “I’m going to write my own binary format because I is haxor” jobs.

There are some specific cases where SQLite isn’t appropriate (streaming). But broadly it fits in 99% of cases.
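
As a rough illustration of the point (the table and file names are made up), using SQLite via Python's built-in sqlite3 module instead of inventing a binary format looks like this:

```python
# Sketch: an application's on-disk data kept in SQLite instead of a homegrown
# binary format; any SQLite tool can inspect the file later.
import sqlite3

con = sqlite3.connect("app_data.db")  # placeholder filename
con.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, sensor TEXT, value REAL)")
con.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(1696300000.0, "temp", 21.5), (1696300060.0, "temp", 21.7)],
)
con.commit()

for sensor, avg in con.execute("SELECT sensor, AVG(value) FROM readings GROUP BY sensor"):
    print(sensor, avg)
con.close()
```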

[–] [email protected] 14 points 1 year ago* (last edited 1 year ago) (1 children)

To add to this: converting to JSON or another standardized format in every single case where someone is tempted to write their own custom parser. Never write custom parsers, kids; they're an absolutely horrible time-suck and you'll be fixing them quite literally forever as you discover new and interesting ways for your input data to break them.

Edit: it doesn't have to be json, I really don't care what format you use, just pick an existing data format that uses a robust, thoroughly tested, parser.
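
A tiny sketch of what that buys you, using the stdlib json module (the record shape here is invented):

```python
# Sketch: reuse a battle-tested parser instead of hand-rolling a format.
import json

record = {"job_id": 42, "status": "ok", "durations_ms": [12.5, 9.8, 11.1]}

text = json.dumps(record)  # producer side: no custom escaping rules to get wrong

try:
    parsed = json.loads(text)  # consumer side: malformed input fails loudly
except json.JSONDecodeError as err:
    raise SystemExit(f"bad input: {err}")

print(parsed["durations_ms"])
```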

[–] [email protected] 33 points 1 year ago* (last edited 1 year ago) (7 children)

I wish there was a more standardized open format for documents. And more people and software should use markdown/.md because you just don't need anything fancier for most types of documents.

[–] [email protected] 30 points 1 year ago (2 children)

Data output from manufacturing equipment. Just pick a standard. JSON works. TOML / YAML if you need to write as you go. Stop creating your own format that’s 80% JSON anyways.

[–] [email protected] 28 points 1 year ago (3 children)

I don't give a shit which debugging format any platform picks, but if they could each pick one that every emulator reads and every compiler emits, that'd be fucking great.

[–] [email protected] 14 points 1 year ago* (last edited 1 year ago) (9 children)

Even simpler: I'd really like it if we could just unify whether or not $ is needed for variables, and pick either # or // for comments. I'm sick of breaking my brain when I flip between languages because of these stupid little inconsistencies.

[–] [email protected] 27 points 1 year ago* (last edited 1 year ago) (6 children)

I'd set up a working group to invent something new. Many of our current formats are stuck in the past, e.g. PDF and ODF are still emulating paper even though everybody reads them on a screen. What I want to see is a standard document format built for the modern-day Internet, with editing and publishing in mind. HTML isn't it, as it can't handle editing well or long-form documents; EPUB isn't supported by browsers; Markdown lacks a lot of features; etc. And then you have things like Google Docs, which are Internet-aware, editable, and shareable, but also completely proprietary and lock you into the Google ecosystem.

[–] [email protected] 14 points 1 year ago (3 children)

Epub isn't supported by browsers

So you want EPUB support in browsers, and then you have the ultimate document file format?

[–] [email protected] 26 points 1 year ago* (last edited 1 year ago) (1 children)

~~XML for machine-readable data because I live to cause chaos~~

Either Markdown or Org for human-readable text-only documents. MS Office formats and the way they are handled have been a mess since the 2007 "-x" versions (.docx, .xlsx, etc.) were introduced, and those and the OpenDocument formats are way too bloated for when you only want to share a presentable text file.

While we're at it, standardize the fucking markdown syntax! I still have nightmares about Reddit's degenerate four-space-indent code blocks.

[–] [email protected] 18 points 1 year ago (3 children)

Man, I'd love it if markdown were more widely used; it's pretty much the perfect format for everything I do.

[–] [email protected] 26 points 1 year ago* (last edited 1 year ago)

I'd like an update to the EPUB ebook format that leverages zstd compression and JPEG XL. You'd see much better decompression performance (especially for very large books), smaller file sizes, and/or better image quality. I've been toying with the idea of implementing this as a .zpub book format and a plugin for KOReader but haven't written any code for it yet.

[–] [email protected] 22 points 1 year ago (12 children)

.opus for lossy music, .flac for lossless music, .png for image files, .mkv for video

[–] [email protected] 16 points 1 year ago (6 children)

All of them are OK, except mkv is less a file type and more a container. What should be specified is the codec for the video, which for most things I'd say should be AV1, though it might not be the most suitable choice for high-res movies. Throw in Opus for the audio track and you can use mkv, but you might as well use webm anyway, since it makes it clearer what's inside (though it can still hold other things).

I'd also add that jxl should be the standard for lossy images. Better than jpg. And you want something other than png for massive images because that quickly gets costly in terms of size due to png being lossless.
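
For reference, encoding AV1 video with an Opus audio track into Matroska might look like this: a sketch assuming an ffmpeg build with libaom-av1 and libopus on the PATH, with placeholder filenames and an example CRF value.

```python
# Sketch: invoke ffmpeg to encode AV1 video + Opus audio into Matroska.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-i", "input.mov",                       # placeholder input file
        "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",   # quality-targeted AV1
        "-c:a", "libopus", "-b:a", "128k",                 # Opus audio track
        "output.mkv",
    ],
    check=True,
)
```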

[–] [email protected] 19 points 1 year ago

UTF-8 for plain text; trying to figure out the encoding, especially with older files/equipment/software, is super annoying.
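
A small sketch of the habit that avoids the guessing game: always name the encoding explicitly instead of relying on platform defaults (the file name is a placeholder).

```python
# Sketch: explicit UTF-8 on both write and read; no encoding guesswork later.
from pathlib import Path

path = Path("notes.txt")  # placeholder path
path.write_text("naïve café ✓", encoding="utf-8")

# The default errors="strict" makes a mis-encoded legacy file fail immediately
# instead of silently mangling characters.
print(path.read_text(encoding="utf-8"))
```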

[–] [email protected] 19 points 1 year ago* (last edited 1 year ago) (4 children)

.gltf/.glb for models. It's way less of a headache than .obj and .dae, while also being way more feature rich than either.

Either that or .blend, because some things other than blender already support it and it'd make my life so much easier.

[–] [email protected] 19 points 1 year ago (2 children)

TOML for configuration files
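
A minimal sketch of reading one, using the standard-library tomllib (Python 3.11+; the keys shown are made up):

```python
# Sketch: TOML config parsed with the stdlib tomllib module (Python 3.11+).
import tomllib

config_text = """
[server]
host = "127.0.0.1"
port = 8080

[logging]
level = "info"
"""

config = tomllib.loads(config_text)
print(config["server"]["port"])    # 8080, already an int: no ad-hoc casting
print(config["logging"]["level"])
```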

[–] [email protected] 19 points 1 year ago (4 children)

Matroska for media; we already have MKA for audio and MKV for video. An image container would be good too.

mp4 is more prone to data loss and slower to parse, while also being less flexible; despite this, it seems to be a sort of pseudo-standard.

(MP4, M4A, HEIF formats like heic, avif)
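
Since Matroska is just a container, moving existing streams into it doesn't require re-encoding; a sketch using ffmpeg from Python (assumes ffmpeg is installed, filenames are placeholders):

```python
# Sketch: losslessly remux an MP4 into Matroska; "-c copy" rewraps the
# existing video/audio streams without touching the encoded data.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c", "copy", "output.mkv"],
    check=True,
)
```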

[–] [email protected] 16 points 1 year ago (10 children)

Markdown for all rich text that doesn't need super fancy shit like latex

[–] [email protected] 15 points 1 year ago (2 children)

JPEG XL for images because it compresses better than JPEG, PNG and WEBP most of the time.

XZ because it theoretically offers the highest compression ratio in most circumstances, and long decompression time isn't really an issue when the alternative is downloading a larger file over a slow connection.

Config files stored as serialized data structures instead of in plain text. This speeds up read times and removes the possibility of syntax or type errors. Also, fuck JSON.

I wish there were a good format for typesetting. Docx is closed and inflexible. LaTeX is unreadable, inefficient to type and hard to learn due to the inconsistencies that arise from its reliance on third-party packages and its lack of guidelines for their design.
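
Picking up the xz point above: a minimal sketch of squeezing a file as hard as possible with the stdlib lzma module (filenames and the preset are placeholders; compressing at preset 9 is slow, but only the publisher pays that cost once).

```python
# Sketch: maximum-ratio xz compression via the stdlib lzma module.
import lzma
import shutil

with open("dataset.csv", "rb") as src, lzma.open("dataset.csv.xz", "wb", preset=9) as dst:
    shutil.copyfileobj(src, dst)  # stream the data instead of loading it all at once
```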
