this post was submitted on 06 Feb 2024
Linux
That is kind of what I said with 'you're probably comparing different things'. My experience, however, is entirely different. I've been using Linux for quite a while now, and while not everything is perfect, I think it does the job reasonably well. On my laptop all the applications I chose work quite well. I'm currently on the GNOME desktop (since you mentioned that), but I can't really follow your woes. I've installed some 50 desktop applications, do development and so on, and everything just works for me. They all apply the GTK theme I chose, most of them also honor the dark mode setting, and they tie into the system very well. I really can't complain about that part of Linux.

On top of that I have the command line, where everything ties together superbly. It follows the Unix philosophy: I have tools that are each supposed to do one task, but do that task well, plus a simple means of connecting and chaining them, which makes things really easy. I don't know how Apple does things. I suppose you can also instruct a Mac to find all the vacation pictures from 2021, transfer them to the external hard drive and then remove them from the laptop; that might be easier or more time-consuming there. But the command line really shines when you do complex development work: prepare a complex development project, handle the dependencies, run automatic (unit and integration) tests and deploy it. Or stuff like that. I really can't live without that convenience.
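To make the "vacation pictures from 2021" example concrete, here is a minimal sketch of that kind of chained command-line task. The paths are made up for the demo (it builds a throwaway tree under a temp directory so it runs anywhere); in real use you'd point it at ~/Pictures and the mounted external drive.

```shell
#!/bin/sh
# Sketch: find the 2021 vacation pictures, move them to the "external drive".
set -eu
demo=$(mktemp -d)
mkdir -p "$demo/pictures" "$demo/external"
touch -d 2021-07-01 "$demo/pictures/beach.jpg"
touch -d 2021-08-15 "$demo/pictures/hike.jpg"
touch -d 2023-01-01 "$demo/pictures/newyear.jpg"

# find selects files by modification year (GNU find's -newermt),
# mv does the transfer: one pipeline instead of a file-manager session.
find "$demo/pictures" -name '*.jpg' \
    -newermt 2021-01-01 ! -newermt 2022-01-01 \
    -exec mv {} "$demo/external/" \;

ls "$demo/external"
```

Each tool stays small and the shell does the gluing, which is the Unix-philosophy point above.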
And which kind of cohesion on the desktop are you missing that I don't? Sure, Apple has one big ecosystem with everything tied into it, and that is convenient and easy as long as you stay within that one ecosystem. Linux, for example, doesn't sell an operating system, online services, software and a central marketplace all at once; that is true. But I sometimes like not putting all my eggs into one basket. That is personal preference. How easy is it on Apple if you want to break free from those confines? Or if you want to use something that isn't integrated well there? Like a game that isn't available for Mac, or Microsoft Access or other specific software you're forced to use for work? Or you want to watch virtual reality pornography and that happens to be something the App Store cracks down on?
This is something I'd also like to see for Linux: not Flatpak, which sits on top and circumvents the system, but something that is baked into it. I think you're mixing things up a bit. Flatpak isn't tightly integrated into the system and it isn't Linux's default choice, while Apple's sandboxing is part of the system and the (default) way to run software. If we really compare the two, Linux is entirely missing that kind of sandboxing and clean software distribution. A Linux program spreads its files across several directories, and if it's desktop software, it just runs with the user's permissions. That is the old-fashioned way of doing it, and since Linux is profoundly more diverse, it is really difficult to change. systemd, cgroups and so on address the permission aspect of that, but we're far away from proper sandboxing by default.
You just can't compare Flatpak specifically to the way Apple does it. Flatpak is focused on decoupling things from the system, not integrating them. You'd have to compare it to a solution that decouples things on a Mac; since Flatpak isn't available there, the closest analogue is probably a virtual machine. If you install Windows in a virtual machine on your MacBook, does it follow the Mac theme? No, it doesn't. Does the browser in that Windows have access to your Mac password manager? No, it doesn't. But that would be a fairer way of comparing the two things.
I would really like Linux to step up its game in a few areas. Desktop application sandboxing and distribution is one. If you use Flatpak for this, the blame is on you; it is not the solution I'd like to see, and it is not the intended way of using Linux, so you can't really complain. We need a proper solution instead. Another thing I'd like to see is mobile apps. We also don't have anything that allows for connected standby: Android, for example, can receive chat messages and emails while it's sleeping with the screen off, and so can iPhones. When I close the lid of my laptop or use Linux on my phone, it just stops receiving chat messages once the screen is off and the processor is on standby.
Flatpak was designed to fit a purpose. And the people who develop it have some motivation to do so. That might not align with your motivations. Maybe they wanted something to isolate things and you want something that connects things. That clashes. You might be using the wrong tool.
You don't have to upsell me on Linux, I'm already sold. However, I use all of Linux, Windows and macOS at work and at home, and I notice the subtle differences when it comes to a polished experience and cohesion - Windows is the worst on cohesion, as expected.
I'm not talking about ecosystems, I'm talking about the small annoyances: from icons that don't have a consistent look across apps on Linux to, for instance, VLANs - whenever I want to add a VLAN, instead of doing it in GNOME Settings (gnome-control-center) I'm forced to use nm-connection-editor, which is a different application. Settings are kind of scattered around. The same happened with WireGuard VPNs for a while...

As for Flatpak, it does use a lot of the same containerization technologies that Docker, LXC etc. use, such as cgroups, namespaces and bind mounts. To me it seems more like the higher levels are missing pieces to facilitate communication between applications (be it protocols, code or documentation), and sometimes it is as simple as configuration.
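For what it's worth, the VLAN case can at least be scripted around: NetworkManager accepts VLANs from the CLI (nmcli connection add type vlan con-name vlan10 dev eth0 id 10) or as a declarative keyfile. A hypothetical sketch, assuming a parent interface named eth0 (adjust names and IDs to your setup):

```ini
# /etc/NetworkManager/system-connections/vlan10.nmconnection (mode 0600)
[connection]
id=vlan10
type=vlan
interface-name=eth0.10

[vlan]
parent=eth0
id=10

[ipv4]
method=auto
```

After dropping the file in place, nmcli connection reload picks it up. That doesn't fix the scattered GUI, but it routes around it.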
Apple also has that; macOS is a UNIX system, for what it's worth, and that's the reason why a large number of developers use it instead of Windows. The CLI tools you use under Linux are most likely available for macOS as well via https://brew.sh/.
To be fair, Apple actually does a decent job when it comes to connecting things: they even created a scripting language called AppleScript that was made specifically so you can automate macOS GUI applications easily. On Linux this can be done with Strongwind (deprecated?), dogtail (dead?) or xdotool, but those tools are sloppy and hard to use.
In AppleScript you can access the native APIs of macOS GUIs and simulate a user clicking buttons and menus; you can also tell it to record some action on the GUI and it will translate it into code.

Recently Apple even made their macOS GUI automations available from JavaScript, and you can do the exact same things you used to do from AppleScript in JavaScript. They actually invested so much into it that you can even build entire macOS desktop apps, using the typical UI components and frameworks Apple provides, with JavaScript.
There's a difference between macOS and iOS. What you described is what happens on iOS - you're required to use the store and whatnot - but under macOS you can get applications from anywhere you want, like you do on Windows and Linux.
Apple does enforce a LOT of separation. They call it sandboxed apps and it is all based on capabilities; you may enjoy reading this. Applications get their isolated space at ~/Library/Containers and are not allowed to just write to any file system path they want. A sandboxed app may even think it is writing into a system folder - for preference storage, for example - but the system rewrites the path so that it ends up in the Containers folder instead. Under macOS, apps typically write their data to ~/Library/Application Support; a sandboxed app cannot do that, and the data is instead written beneath the ~/Library/Containers/app-id path for that app.

And here's how good Apple is: any application, including 3rd-party tools running inside your Terminal, will be restricted. I bet you weren't expecting that a simple ls would trigger the sandbox restrictions applied to the Terminal application. The best part is that instead of doing what Flatpak does (just blocking things and leaving the user unable to do anything), the system will prompt you for a decision.

But okay, I see your point. I still believe that Flatpak could've done a few things better just by looking at what Apple does.
Half of the success of Windows and macOS is the fact that they provide solid and stable APIs and development tools that "make it easy" to develop for those platforms. Linux is very bad at that. If major pieces of an OS are constantly changing and require large reworks of applications, then developers are less likely to support it. To be fair, the Linux situation might be even harder than that - there is no distribution-"sponsored" IDE (like Visual Studio or Xcode), nor unified userland API documentation, frameworks etc.
If Linux were able to provide those things, we might even get proprietary software like Adobe's on Linux, because let's face it, the lack of Adobe and others is also Linux's fault, not only those companies'. It is really fucking hard to develop and support software for Linux when you have to deal with at least two major half-assed desktop environments (KDE and GNOME), and one of them decides to reinvent the wheel every now and then, breaking APIs with little to no regard for software. To make things worse, you'll end up finding out that most of the time people are running KDE plus a bunch of GNOME/GTK/libadwaita components, creating a Frankenstein of a system because some specific app depends on said components.
Hehe, yeah, I see. I can agree with a lot of that. Maybe I should try a Mac for once, and for more than 20 minutes. I mostly read the iPhone stuff and shake my head: how they force developers to buy a Mac, restrict the whole iPhone ecosystem. I don't think I'd feel at home on a platform like that.
Concerning the MacBooks: I've recently learned about the M2 and M3 MacBooks and their outstanding performance at some workloads, for example people doing machine learning (AI) stuff on them. The number of tokens an LLM can process/generate on them is on a whole other level compared to what my Intel machine does. I think Apple did a good job with that hardware. However, they cost so much more... I can get a very decent frame.work laptop with a modern Ryzen for $2,070, or buy a new MacBook for $3,400 with a bit less RAM and the same amount of storage. It'd be faster at a singular workload I'm somewhat interested in, but I'm not sure if it is worth that kind of money.
And I think I'm getting old. I'm accustomed to how Linux works, I know my way around, I have my workflow set up. I'm not sure I can be bothered to learn something new... exchange the little annoyances for something that requires me to adapt to an entirely new workflow... Maybe I'll try it anyway. See if there are cracked versions of macOS that I can boot in a VM and whether I like it. I have to think about that.
Thank you for the discussion. I really don't see Flatpak as the pinnacle of software distribution, but Linux is constantly evolving, and I'm pretty sure we'll get there someday for desktop applications. I think all the containerization stuff, cgroups and systemd, is a good approach. It makes many things so much easier than they used to be: I can spin up light containers and services and have them run with arbitrary permissions and environments on a server, and all I need is a few lines of text. Sure, on the server I configure the permissions and what they're allowed to access myself; that can't be transferred directly to the desktop. We still need additional interfaces and especially ways to address what you said. Linux is a good desktop operating system, but some things need to be solved better (or at all).
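As a sketch of what those "few lines of text" buy you, here is a hypothetical systemd service unit (myapp and its path are made up) using the cgroup- and namespace-based hardening directives - the same kernel primitives a future desktop sandbox could build on:

```ini
# /etc/systemd/system/myapp.service - hypothetical example
[Unit]
Description=Example sandboxed service

[Service]
ExecStart=/usr/bin/myapp
# Transient unprivileged user, created on start
DynamicUser=yes
# Private /tmp, read-only OS directories, no access to /home
PrivateTmp=yes
ProtectSystem=strict
ProtectHome=yes
# Forbid privilege escalation via setuid binaries
NoNewPrivileges=yes

[Install]
WantedBy=multi-user.target
```

Running systemd-analyze security myapp.service afterwards scores how much of the attack surface such directives close off.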
Since you mentioned GUI application automation: that is a crazy approach. I've seen CI pipelines using such tools to test GUI applications and web interfaces. Load XY, press TAB 4 times, hit enter, search for an element with Z in the name, press ALT+F, do something else and then take a screenshot... The whole thing looked completely mental (to me). I think there is something like that on Windows, too. I can only imagine things like that break easily, and that you're never able to change anything if people actually rely on it. But I'm really not an expert on this; it might have valid use cases, or it's just a silly way of doing things.
Something I don't agree with is Windows and macOS succeeding because of solid and stable APIs. Theoretically this might be the case for developers; for Windows desktop end-users it certainly is not. My family threw out several printers because after a Windows update there were no drivers available any more. Most of my old games don't work any more, I've tried: installing the old .NET or C++ runtimes and DirectX versions is a hassle, sometimes impossible, and some games crap out entirely. I can't do it the other way around and install an old version of Windows on modern hardware either. So while in theory the Windows kernel API might enjoy a good development model, it has little to zero effect on end-users and on why they buy Windows laptops in large quantities. And if success in the market is the measurement, then contrary to Windows, Linux is the dominant operating system on servers and very successful there. So I don't think this is the real reason. But reliable interfaces are certainly something we want.

Apple changed the entire processor architecture, and then again. With them, things also don't stay the same; they solve that with other techniques. And a MacBook won't be thrown in the garbage after a few years because it's gotten so slow - I see people keeping them for quite some time. But they usually don't run the latest version of macOS any more, at least from what I've seen.
Anyway, it's getting kind of late here. Thanks for the comment and the additional info you linked. I'm going to read the links tomorrow.
You don't need any cracks. The issue with running macOS in a VM is that the VM won't provide a compatible GPU and it will lag a lot. Yes, it's painful, and there aren't decent workarounds unless you can pass through an entire GPU supported natively by macOS.
Yes, there's vTask (proprietary) and AutoIt for Windows. The second one is very good and very reliable.
AutoIt doesn't break as much as you'd think if the developer knows what he's doing. "Unfortunately" I spent the better part of 2010 coding AutoIt to automate exporting data from a very proprietary piece of Siemens software, and after a few months you just learn how to do it properly :P It can target the Win32 controls directly and you can bind code to UI events by their internal Windows IDs. Another interesting thing it can do (sometimes) is explore a program's DLLs and internal function calls and call those directly from your code instead of clicking buttons.
What Apple does with AppleScript is a less advanced version of AutoIt: you can call their frameworks' functions directly from it (hence the ability to build entire applications) and interact in robust ways with the GUI of an application. Applications can also load "plugins" into the editor and provide methods for certain tasks the developer decided might be important for someone.
In macOS land the use case is allowing anyone without much coding experience to automate some GUI task. While not perfect, this is a large win for a lot of people, especially because you can just click "record" > do your repetitive task > "finish" and it will translate the task into code. The best part is that this "record" feature doesn't actually record click positions; it finds out the IDs of the buttons and menus you clicked and writes optimized, reliable code for the task.
In my case with the Siemens software the use case was very simple: we needed access to data that was only made available either through their software at about 300 €/month OR with a special license and another tool (that provided a local API via socket with the data) that would cost around 50 000 €/month. When you see a price like that, I believe it's totally justifiable and okay to automate the UI. Note that this was in 2010, and from what I've been told my code is still running the same task today without changes (AutoIt is compiled and they don't even have the source). I believe this speaks volumes about how reliable AutoIt can be.
And developers create software that people use. Large companies, without being given stable APIs and good documentation, won't ever feel like developing for Linux. They can't justify a very expensive development process with large maintenance costs for such a small market share. If the APIs were more stable and there were better frameworks, it would be easier to justify.
It's not just about the kernel, it's about the higher-level APIs and frameworks that let developers work quickly. It's about having C# and knowing the thing is very well supported in every corner of Windows and whatnot. It's about having entire SDKs with everything integrated into an IDE made by them, where everything works on the first try.
It seems you're picking the hard case - games. But you can, for instance, install Office 2003 and Photoshop 6 on Windows 11 and they'll run without hacks - the Linux desktop (not CLI) never offered this kind of long-term support. Recently I had an experience with an old game on modern Windows that might interest you: https://lemmy.world/post/10112060.
Apple simply obliterates the old and doesn't care much about it; Microsoft is usually way better at this. BUT... still, as you've noticed, their Rosetta 2 compatibility layer lets you run Intel software on ARM machines without issues, even games and heavy stuff.
Yes, they have restrictions because they usually want to clean their kernel and some system components of support for older hardware, and this seems to be a big advantage when it comes to the performance and reliability of their OS. Either way, those machines with older macOS versions keep working and getting at least most of the software for a reasonable time.
Yes, I Googled a bit and found out how to virtualize macOS; the install has already done its first reboot. Seems they took inspiration from Scotty from the TOS Enterprise: it suggested 2h50 at first, but the minutes are coming down fast.
We'll see about that graphics acceleration. The laptop doesn't have a dedicated GPU anyway. Either QEMU/KVM handles it, or I can pass through half the Intel iGPU, or it'll just be slow.
I can empathize with your story about the GUI automation. Sometimes you just need a solution for your problem, and if it's still running more than 10 years later it probably was the right call. Sometimes crazy workarounds stick and do the trick. You can always calculate whether buying software/a license or paying someone to come up with a solution is cheaper; 13 × 12 × 50 000 € is a good amount of money.
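Spelling out that back-of-the-envelope figure (13 years × 12 months of the 50 000 €/month license the automation avoided):

```shell
# 13 years x 12 months x 50 000 EUR/month of avoided licensing
total=$((13 * 12 * 50000))
echo "$total EUR"   # 7800000 EUR
```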
It just gets a bit messy once you're forced to rework a hacked-together solution in production. But it really depends on the circumstances. I've seen old machines that did crazy jobs and broke down, or had to be integrated into something else at some point. And then you have a 10-year-old operating system you can't change much on; the employee who cobbled together that solution has long left, and the company that initially sold the expensive, specialized software/hardware has changed the product twice in the meantime... Might turn a few of your hairs grey, especially if someone absolutely needs to use it on Wednesday, but somehow it usually works out. If it's tastefully done and documented, everything might be perfectly alright.
Thank you for the Midtown Madness 2 link. I need that, too. Spent quite some time in that blocky version of San Francisco when I was a kid.
I don't really have a better use case for Windows on my laptop at home. I use it to update stuff like the GPS and probably one or two other things. I moved a few games there after the SSD with Linux on it filled up.
(Edit: The install is done. You were right, the desktop is totally sluggish and I don't have any sound. I skipped the Apple ID. I've closed it for now; maybe I can find better settings on the weekend and try to install something on it.)
Assuming you have a GPU supported by macOS, you might be able to get good results by treating it like a Hackintosh: https://dortania.github.io/OpenCore-Install-Guide/
I'm not sure how macOS plays with GVT-g / SR-IOV / sharing slices of hardware, but this guy says he got it to work: https://www.reddit.com/r/VFIO/comments/innriq/successful_macos_catalina_with_intel_gvtg/. I personally never got macOS GPU acceleration working fine in a VM because my host is NVIDIA and unsupported. However, I did have very good results on HP Mini computers running macOS by following the links above.
I believe the hacks work with other games from that time as well, as they solve the DirectX and GPU issues nicely without permanent changes to your system.