No Stupid Questions
No such thing. Ask away!
!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.
The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:
Rules
Rule 1- All posts must be legitimate questions. All post titles must include a question.
All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as a title, etc. are not allowed here. See Rule 6 for all exceptions.
Rule 2- Your question subject cannot be illegal or NSFW material.
Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.
Rule 3- Do not seek mental, medical, or professional help here.
Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.
Rule 4- No self promotion or upvote-farming of any kind.
That's it.
Rule 5- No baiting or sealioning or promoting an agenda.
Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.
Rule 6- Regarding META posts and joke questions.
Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.
On Fridays, you are allowed to post meme and troll questions, on the condition that they're in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.
If you post a serious question on Friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.
Rule 7- You can't intentionally annoy, mock, or harass other members.
If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.
Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you have been provably vocal about your hate, then you will be banned on sight.
Rule 8- All comments should try to stay relevant to their parent content.
Rule 9- Reposts from other platforms are not allowed.
Let everyone have their own content.
Rule 10- The majority of bots aren't allowed to participate here.
Credits
Our breathtaking icon was bestowed upon us by @Cevilia!
The greatest banner of all time: by @TheOneWithTheHair!
Or we work lul
A bubble means that investors are putting more money into a particular field than the field is really worth. How does that happen? Well, investors make money by investing in small companies and hoping that they get bigger over time. They need to guess which companies will actually get big. While investors generally try to make these guesses logically, there's inherently a bit of "trust me bro" involved in making these decisions.
A bubble happens when investors increasingly rely on "trust me bro" to make their investment decisions, and so they pour more and more money into a field that might not really need or deserve that much money. That's not to say the field is intrinsically useless - just that the hype has overtaken its actual usefulness. So when you see something being hyped up, you should generally view it with caution.
AI as a field is very hyped up right now, and so there's concern that it might be a bubble.
How does a bubble pop? Randomly and without warning. The problem with bubbles is that they're driven primarily by hype and "trust me bro," and so if anything blows the hype, it will cause all the investors to snap back to reality and pull all their money. That's a lot of money being pulled from a single field at the same time, and that'll absolutely crash the field. A company going under might trigger a pop, or it could be a random news article that went viral saying that AI is a fraud, or it could be a lackluster product launch. Hype is inherently unstable, and so it can be difficult to predict when and why a bubble pops.
The implosion that happens during a pop isn't about any particular company; it's about the entire field as a whole. It could very well happen (though it's unlikely) that not a single company goes bankrupt during a pop. It's merely that those companies would lose a lot of the investor funding they had previously been relying on. As investors lose hype in AI, companies will no longer feel as strong a push to include AI in their products. At the same time, AI companies will slow down their product development due to lower funding, so they won't be able to make as big of a splash in the news when they launch a new product.
The observed effect is that one day everything is AI, and the next day, nothing is AI. Think about NFTs and cryptocurrency - most companies that dealt with NFTs and crypto survived, but we no longer hear about NFTs because they lost their hype and so lost their funding.
Super informative, thanks!
Somewhat relevant, I would like to recommend this one: https://thedailywtf.com/articles/classif-wtf-the-virtudyne-saga
It's a fun read about a company that grew because of the '90s IT bubble, and how it all came crashing down when the bubble burst and it became obvious that the investors would probably never get any return on their investment.
Colloquially, it means we'll hopefully stop seeing "AI" shoved into every nook and cranny of every piece of software to tick a buzzword box.
There are a lot of AI-based startups promising their investors all sorts of outlandish results. Eventually it's going to become clear that those results aren't coming, and the investors will come calling.
It's extra frustrating for those of us working in companies where ML is actually a useful tool that we've been leveraging for years.
I remember when I was arguing "It's not 'learning', it's algebra".
Now the hype has gone from learning to intelligence.
It's still algebra.
Well yeah - but technically everything is just algebra.
If it happens on a computer, it can be expressed in s-expressions.
Sure, but I do think there's a difference between your message arriving here, and MNIST recognition algorithms.
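To make the "it's algebra" point concrete, here's a minimal sketch of an MNIST-style classifier's forward pass. The weights are hypothetical and untrained and the input is made up, but the point stands: it's nothing but matrix multiplications, a ReLU, and a softmax.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, untrained parameters: 784 pixels -> 128 hidden units -> 10 digits
W1, b1 = rng.normal(scale=0.01, size=(784, 128)), np.zeros(128)
W2, b2 = rng.normal(scale=0.01, size=(128, 10)), np.zeros(10)

def forward(x):
    """x: a flattened 28x28 image as a length-784 vector of pixel values."""
    h = np.maximum(0, x @ W1 + b1)        # linear map, then ReLU
    logits = h @ W2 + b2                  # another linear map
    logits -= logits.max()                # numerically stable softmax
    p = np.exp(logits)
    return p / p.sum()                    # ten "probabilities" that sum to 1

fake_image = rng.random(784)              # stand-in for a real digit image
print(forward(fake_image))                # no understanding, just linear algebra
```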
Yes. They don't have to be public companies for investors to lose their shirts, and employees to lose their jobs.
Investors have invested lots of money into these companies, which means that in some form or another these companies have agreed to pay those investors back in some way. You can think of money quite literally like a river: the motion of that river is what gives energy to businesses so they can do their things.
In a normal not-bubble market, there is a flow of cash that goes from investors, into the company, and then back out to investors so they can do other things with it.
In a bubble market, a lot of cash is flowing into the company, but little or no cash is flowing back out to investors. There are two possible things that happen here: either the cash eventually starts flowing again and we're all good, back to normal after some stabilization period, or people stop pumping cash into the business and the dam breaks. All that money is lost, or all that potential business energy is lost, or some combination of the two. No matter how you slice it, it's wasted effort.
To keep with the water metaphor the AI market is like a hose that's wound up in a box we can't see into. We've pumped a ton of water into this hose and haven't seen anything come out the other end. There could be a leak somewhere, or maybe we don't have enough water to even get through the hose and people will want to use their water for other things instead. One thing we do know is that we've devoted so much water to this operation that if something does go wrong it has to go wrong spectacularly.
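As a toy illustration of that cash flow (every number here is made up for the example, not a real figure from any AI company), here's what happens to a company's balance when investor money keeps it afloat and then the inflow stops:

```python
# Toy cash-flow model with made-up numbers: investor money keeps the
# balance up; when the inflow stops, the runway drains fast.
balance = 0.0
monthly_investment = 100.0   # hypothetical: cash pumped in by investors
monthly_revenue = 10.0       # hypothetical: cash coming in from actual customers
monthly_costs = 70.0         # hypothetical: salaries, GPUs, data centers...

for month in range(1, 25):
    if month > 12:           # after a year, investors stop pumping money in
        monthly_investment = 0.0
    balance += monthly_investment + monthly_revenue - monthly_costs
    if balance < 0:
        print(f"Month {month}: the dam breaks (balance {balance:.0f})")
        break
    print(f"Month {month}: balance {balance:.0f}")
```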
The big players will likely continue to develop this tech. "The Bubble" is more about the marketing and speculative investing in anything with "AI" tacked onto it. There's no reality where everything AI is being ham-fisted into is going to be successful. There will be some winners, but there will be a lot of losers when the bubble bursts.
In the meantime, we have to put up with every company and product marketing that they now have "AI" in their product, whether it's actually useful or any different from what it was before.
It's the new Squarespace for Uber AI on the blockchain, using ASIC GPT miners.
If anyone actually knew WHEN it would crash, they wouldn't be saying anything; they'd be placing bets via the stock market instead.
Back when Blockchain was first a huge hype bubble, there were companies that added "Blockchain" to their name, or announced a pivot into Blockchain tech, and watched their stock value soar by a few hundred percent (with market value being many times their revenue).
I googled for news articles until I found this:
https://www.sciencedirect.com/science/article/abs/pii/S0165176519301703
A noteworthy example: https://www.cnbc.com/2017/12/21/long-island-iced-tea-micro-cap-adds-blockchain-to-name-and-stock-soars.html
Anyway.
That's the bubble.
Over-valuation. People taking advantage of the hype. People jumping on any opportunity to "not be left out" or to "get in early".
AI has uses.
Everyone is throwing things at the wall to see what sticks. Not much of it will.
Marketing is capitalising on the hype.
I think of AI as Beanie Babies. All the rage (excitement, not anger) and will eventually (hopefully) die on the vine when full AGI never comes.
I don't understand what people mean with "full AGI"
Meaning it's not an LLM. There's no intelligence with current "AI"; it's pattern recognition - the same thing as machine learning with a different name. It doesn't actually understand anything it spits out.
You agree with me. But what do people who don't agree with me imagine when they think of AGI?
The American stock market is hugely weighted toward the top 5 or so companies, all of which (if I remember right) have jumped hugely in value based on AI (Nvidia, Microsoft, Amazon, Apple, Meta). So if it turns out, or investors decide, that there isn't a way to make AI profitable, those valuations tumble, as does the American stock market and likely the world's.
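To see why that weighting matters, here's a rough back-of-the-envelope sketch. The weights and drops below are made-up illustrative numbers, not real market data, but they show how much a cap-weighted index moves when just a handful of heavily weighted names fall:

```python
# Back-of-the-envelope sketch with made-up numbers: how a cap-weighted
# index moves when a few heavily weighted names drop.
top5_weight = 0.30      # assume the top 5 companies make up ~30% of the index
rest_weight = 1.0 - top5_weight

top5_drop = -0.40       # hypothetical: the AI-heavy names fall 40%
rest_drop = -0.05       # hypothetical: everything else dips 5%

index_move = top5_weight * top5_drop + rest_weight * rest_drop
print(f"Index change: {index_move:.1%}")   # -15.5%
```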