this post was submitted on 20 Sep 2023
299 points (84.5% liked)

The summer is over, schools are back, and the data is in: ChatGPT is mainly a tool for cheating on homework.

ChatGPT traffic dropped when summer began and schools closed. Now students are back, and they're using the AI tool again.

[–] [email protected] 3 points 1 year ago (2 children)

If anything, people who like ChatGPT are the ones ignorant of how it works (spoiler: it doesn't).

And kids, whose understanding of technology is limited to YouTube and TikTok, have no clue what an AI is. They see it, like most people, as a magic black box that is incredibly smart. Apart from being a black box, none of that is true.

[–] [email protected] -1 points 1 year ago (1 children)

It works well if you know how to prompt it well. LLMs can do a great deal, but the user needs to know how to use them correctly. The technology is still in its infancy, so it's a bit difficult to use well.
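
For what it's worth, "prompting well" mostly just means being specific about what you want. Here's a minimal sketch, assuming the OpenAI Python SDK (v1-style client); the model name and prompts are only placeholders:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Vague prompt: the model has to guess the level of detail you want.
vague = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[{"role": "user", "content": "Explain sorting."}],
)

# Specific prompt: audience, length, and scope are spelled out.
specific = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a concise programming tutor."},
        {"role": "user", "content": "Explain merge sort to a first-year student "
                                    "in under 100 words, then state its time complexity."},
    ],
)

print(vague.choices[0].message.content)
print(specific.choices[0].message.content)
```

Asking the same question both ways usually shows the gap I mean.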

[–] [email protected] 2 points 1 year ago (1 children)

It does not work. At most it looks like it works.

A human brain is able to understand and process information. An AI simply calculates a mathematical function. There is no reasoning and no understanding of anything; all ChatGPT does is try to look like a human. And by "try to look like a human", I mean "generate sentences that are believable, in form, as something a human could have written".
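
To make "calculates a mathematical function" concrete, the usual formulation (my notation, just a sketch of the standard setup) is a conditional distribution over the next token:

```latex
P(w_t \mid w_1, \dots, w_{t-1}) = \operatorname{softmax}\big(f_\theta(w_1, \dots, w_{t-1})\big)_{w_t}
```

where f_θ is the trained network. A reply is produced by sampling one token at a time from this distribution, which is exactly "generate believable sentences" and nothing like reasoning about their content.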

If you ask it to calculate 2+2, and then tell it that it is equal to 5, it won't see any problem because it doesn't understand any of it. But it will give you answers that are, grammatically speaking, reasonably human.

If I ask a rock what 2+2 is, throw it, and it bounces 4 times, that does not mean the rock knows how to count. ChatGPT and similar models are just better illusions, but they're nothing more.

[–] [email protected] -2 points 1 year ago (1 children)

> It does not work. At most it looks like it works.

No, it does work; it's just that most people don't understand it's essentially a very complex lookup table driven by a predictive model. It doesn't think, feel, or imagine; it runs a function that chooses the next fitting word based on the previous input and its training set. In that regard it is damn good.
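
As a toy illustration of that "lookup table plus predictive model" idea (my own sketch, not how ChatGPT is actually implemented; real models use a neural network over tokens, but the generation loop has the same shape):

```python
import random
from collections import Counter, defaultdict

# Tiny "training set": count which word tends to follow which.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

table = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    table[prev][nxt] += 1  # the "lookup table" part

def next_word(prev: str) -> str:
    counts = table[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]  # the "predictive model" part

word, output = "the", ["the"]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))  # e.g. "the cat slept on the mat and the cat"
```

The output looks like English because the counts came from English, not because anything was understood.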

[–] [email protected] 3 points 1 year ago (1 children)

Yes, but that's not how people treat it. They use it as a search engine, a programmer, and a teacher. That's my problem: they use it as something it absolutely isn't, and base their opinions, knowledge, and work on that.

[–] [email protected] 1 points 1 year ago

The only issue is the need to teach people to fact-check against first-party sources. Teachers are not always right, documentation is not always right, and search engines are certainly not always right, especially with SEO in the mix. Fact-checking is always necessary, whether you're using generative AI or not. Take a step back: this improves efficiency when used correctly, and you won't teach proper use cases if they aren't brought up early with students in school. You are arguing against yourself here.

[–] [email protected] -1 points 1 year ago

Literally unusable.