Nitter thread from Julio Merino on application responsiveness on early-2000s Windows computers versus modern Windows computers. Videos are available in the linked thread.

Please remind me how we are moving forward. In this video, a machine from ~2000 (600 MHz, 128 MB RAM, spinning-rust hard disk) runs Windows NT 3.51. Note how incredibly snappy opening apps is.

Now look at opening the same apps on Windows 11 on a Surface Go 2 (quad-core i5 at 2.4 GHz, 8 GB RAM, SSD). Everything is super sluggish.

For those thinking the comparison was unfair, here is Windows 2000 on the same 600 MHz machine. Both are from the same year, 1999. Note how the immediacy is still exactly the same; it had not yet been ruined.

[–] [email protected] 4 points 1 year ago (2 children)

It's not just the operating systems; it's also the way software is developed now. Those old Windows applications were probably written in C++, which is only lightly abstracted over C, which is about as close as you can get to machine code without writing assembly.

These days, you might have several layers of abstraction between your code and the machine. Those abstractions are probably themselves built on third-party libraries, which may be chained to still more libraries, so even more code has to be loaded and run. And all of that might not ultimately be machine code at all: in a language like C# or Java, it's an intermediate language that must be JIT-compiled by a runtime, which itself has to be loaded and run before anything can execute. Then the application might add yet another layer of abstraction and run inside a browser-like instance, à la anything Electron-based.
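To make the layering concrete, here is a minimal Rust sketch (mine, not the commenter's; the wrapper types are hypothetical stand-ins for chained libraries, not a real crate). The same one-byte write is performed directly, and then through a stack of wrappers, each of which adds an indirect call that must be dispatched before any real work happens:

```rust
// Hypothetical sketch: each Layer stands in for one library wrapped
// around another. None of these types come from a real crate.
trait Writer {
    fn write(&mut self, byte: u8);
}

// Layer 0: the actual work, a direct push into a buffer.
struct Raw(Vec<u8>);
impl Writer for Raw {
    fn write(&mut self, byte: u8) {
        self.0.push(byte);
    }
}

// A pass-through layer, like a thin third-party wrapper library.
struct Layer(Box<dyn Writer>);
impl Writer for Layer {
    fn write(&mut self, byte: u8) {
        self.0.write(byte); // indirect (vtable) call into the layer below
    }
}

fn main() {
    // Direct call: trivially inlinable, close to "just the machine code".
    let mut raw = Raw(Vec::new());
    raw.write(42);

    // Five layers deep: five dynamic dispatches before the push happens.
    let mut stacked: Box<dyn Writer> = Box::new(Raw(Vec::new()));
    for _ in 0..5 {
        stacked = Box::new(Layer(stacked));
    }
    stacked.write(42);
}
```

In the direct case the compiler can inline everything down to a single buffer push; in the stacked case each layer forces a vtable lookup and an extra jump, a small-scale analogue of the load-and-run overhead described above.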

[–] [email protected] 2 points 1 year ago (1 children)

That's a good point. No abstraction is performance-neutral; every abstraction has scenarios where it performs well and others where it is slow. We're witnessing the accumulation of hundreds of abstractions that may be poorly optimized or used for purposes outside their optimal performance zones.
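One small illustration (my example, not from the thread) of an abstraction used outside its optimal zone: Rust's Vec is excellent as a stack but quadratic when pressed into service as a FIFO queue, the job VecDeque was designed for.

```rust
use std::collections::VecDeque;
use std::time::Instant;

fn main() {
    const N: u64 = 100_000;

    // Vec used outside its sweet spot: remove(0) shifts every remaining
    // element, so draining the whole queue this way is O(n^2) overall.
    let mut vec: Vec<u64> = (0..N).collect();
    let t = Instant::now();
    while !vec.is_empty() {
        vec.remove(0);
    }
    println!("Vec as FIFO queue:      {:?}", t.elapsed());

    // VecDeque is built for this access pattern: pop_front is O(1).
    let mut deque: VecDeque<u64> = (0..N).collect();
    let t = Instant::now();
    while deque.pop_front().is_some() {}
    println!("VecDeque as FIFO queue: {:?}", t.elapsed());
}
```

Neither container is bad; each is fast in the scenarios it was built for and slow outside them, which is exactly the trade-off being described.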

[–] [email protected] 4 points 1 year ago

No abstraction is performance-neutral

That's not true. Zero-cost abstractions are a key feature of C++ and Rust. For example, Rust's Option<&T> compiles down to nothing more than a potentially-null pointer.
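That particular claim is easy to verify, since the null-pointer niche optimization for Option<&T> is guaranteed by the Rust language. A small self-contained check (my sketch, not the commenter's):

```rust
use std::mem::size_of;

fn main() {
    // A reference can never be null, so Rust represents None with the
    // otherwise-unused null bit pattern. Option<&T> is therefore
    // guaranteed to be exactly pointer-sized.
    assert_eq!(size_of::<Option<&u64>>(), size_of::<&u64>());
    assert_eq!(size_of::<&u64>(), size_of::<*const u64>());
    println!(
        "&u64 is {} bytes; Option<&u64> is {} bytes",
        size_of::<&u64>(),
        size_of::<Option<&u64>>()
    );
}
```

Because the None case costs no extra space and no extra indirection, the safety the wrapper adds is free at runtime, which is what "zero-cost abstraction" means here.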