nightsky

joined 2 months ago
[–] nightsky 10 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

The ongoing trend of "flat UI" is largely not due to processing power, though. Even inexpensive computers have CPUs and GPUs that could push very fancy graphics without problems; just look at what the same machines can do in game graphics (and I don't mean high-end gaming, I mean the kind of simple gaming that can run on a low-end laptop these days). Some of the early GUIs in the 1980s had "flat design" due to performance limitations, but that went away in the 1990s. Today it could still be a reason in some embedded scenarios with simple microcontrollers, but not on a desktop or laptop computer, and not on smartphones or tablets either.

The reason we have bland flat design is the same reason we still have things like "all surfaces are ugly glossy black plastic" (luckily that one is on its way out) or the "war on physical buttons" aka "touchscreens everywhere"... it's simply a design trend.

[–] nightsky 19 points 4 weeks ago

Was browsing eBay, looking for some piece of older used consumer electronics. Found a listing where the description text was written like crappy ad copy: cheap, over-the-top praise for the thing, but zero words about the condition of the used item, i.e. the actually important part was completely missing. And then at the end of the description it said... this description text was generated by AI.

AI slop is like mold, it really gets everywhere and ruins everything.

[–] nightsky 5 points 1 month ago (1 children)

I like the idea. Or maybe marking such changes in the commit message... I might try to bring that up when the time comes.

[–] nightsky 9 points 1 month ago* (last edited 1 month ago) (3 children)

Ugh, from me as well: sorry to hear that.

I can relate to how you feel about the AI stuff. I also work for GenAI-pilled upper management, and the forced introduction of GitHub Copilot is coming soon. It will make us all super extra productive! ...they say. I'm dreading it already. I won't use it at all, I've already made that clear to my superior. But my colleagues might use it, and then I will have to review the AI slop... uggghh...

Maybe a small silver lining to raise the mood here: a recent article from Monday, "Gartner sounds alarm on AI cost, data challenges"

If even freaking Gartner is now saying "well, maybe AI is too expensive and not actually so useful"... then maybe the world of management will wise up as well, soon, hopefully, maybe?

[–] nightsky 12 points 1 month ago (2 children)

FastCompany: "In Apple’s new ads for AI tools, we’re all total idiots"

It's interesting that not even Apple, with all their marketing know-how, can come up with a convincing reason why users might need "Apple Intelligence"[1]. These new ads are not quite as terrible as that previous "Crush" AI ad, but the one with the birthday in particular... I find it just alienating.

Whatever one may think about Apple and their business practices, they are typically very good at marketing. So if even Apple can't find a good consumer pitch for GenAI crap, I don't think anyone can.

[1] I'd like to express support for this post from Jeff Johnson, which proposes calling it "iSlop"

[–] nightsky 15 points 1 month ago (2 children)

teased by an OpenAI executive as potentially up to 100 times more powerful

"potentially up to 100 times" is such a peculiar phrasing too... could just as well say "potentially up to one billion trillion times!"

[–] nightsky 14 points 1 month ago (14 children)

Ah, so apparently Google has found a new way to make Youtube comments worse.

[–] nightsky 4 points 1 month ago (1 children)

Projects having a self-appointed "BDFL" has become kind of a red flag for me in general. I know the term is used somewhat tongue-in-cheek, but I still find it really off-putting. Ruins the vibes.

It happened just recently: I found an interesting project, was excited about it and even thought about eventually becoming a contributor... until I saw that its founder calls themselves "BDFL", and then I just noped out.

[–] nightsky 20 points 1 month ago (2 children)

Today I was looking at buying some stickers to decorate a laptop and such, so I was browsing Redbubble. Looking here and there, I found some nice designs and then stumbled upon a really impressive artist portfolio. Thousands of designs! Woah, I thought, it must have been so much work to put that together!

Then it dawned on me. For a while I had completely forgotten that we live in the age of AI slop... blissful ignorance! But then I noticed the common elements in many of the designs... noticed how everything is surrounded by little dots or stars or other design trinkets. Such a typical AI slop thing, because somehow these "AI" generators can't leave any whitespace; they must fill every square millimeter with something. Of course I don't know for sure, and maybe I'm doing an actual artist an injustice with my assumption, but this sure looked like Gen-AI stuff...

Anyway, I scrapped my order for now while I reconsider how to approach this. My brain still associates sites like Redbubble or Etsy with "art things made by actual humans", but I guess that certainty is outdated now.

This sucks so much. I don't want to pay for AI slop based on stolen human-created art - I want to pay the actual artists. But now I can never know... How can trust be restored?

[–] nightsky 14 points 1 month ago (5 children)

Using tools from physics to create something that is popular but unrelated to physics is enough for the Nobel Prize in Physics?

So if, say, a physicist creates a new recipe for the world's greatest potato casserole, and it becomes popular everywhere, and they used some physics in creating the recipe, to calculate the best heat distribution or whatever, then that's enough?

[–] nightsky 21 points 1 month ago (3 children)

I wonder if this signals that we're near peak hype. I mean, how much more outlandish can they get without destroying the hype bubble's foundation, i.e. the suspension of disbelief that all this will somehow become possible in the near future? We're at the level of "arrival of an alien intelligence" now; how much further can they escalate that rhetoric without popping the bubble?

[–] nightsky 0 points 1 month ago (6 children)

So, today MS published this blog post about something with AI. It starts with "We’re living through a technological paradigm shift."... and right there I stopped reading, because I don't want to expose my brain to it any further.

But what I found funny is that, also today, there's this news: https://www.theverge.com/2024/10/1/24259369/microsoft-hololens-2-discontinuation-support

So HoloLens is discontinued... you know... AR... the last big paradigm shift that was supposedly going to change everything.
