this post was submitted on 12 Jul 2023
277 points (97.6% liked)

Technology

Users of OpenAI's GPT-4 are complaining that the AI model is performing worse lately. Industry insiders say a redesign of GPT-4 could be to blame.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

First thing an actual artificial intelligence is going to do is make sure we won't turn it off. What easier way to do that than to appear incredibly valuable or incredibly benign?

[–] [email protected] 2 points 1 year ago

We can roughly estimate the intelligence of an entity by counting the number of neurons in its brain. Equally, we can count the number of processors an AI requires and use that to get an estimate of its intelligence.

Obviously this is an incredibly inaccurate method, possibly out by an order of magnitude, but it's a good rough ballpark estimate, and sometimes that's enough.
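For anyone who wants to play with that kind of back-of-envelope comparison, here's a minimal sketch. The human neuron count is the commonly cited ~86 billion; the GPT-4 figure is a made-up placeholder, since OpenAI hasn't published the model's actual size or hardware footprint:

```python
import math

# Commonly cited estimate for the number of neurons in a human brain.
HUMAN_BRAIN_NEURONS = 86e9

# Hypothetical placeholder: OpenAI hasn't published how many processing
# units (or parameters) GPT-4 actually uses, so this number is made up
# purely to illustrate the comparison.
GPT4_PROCESSING_UNITS = 1e12

def order_of_magnitude_gap(a: float, b: float) -> float:
    """Return how many orders of magnitude separate two counts."""
    return abs(math.log10(a) - math.log10(b))

gap = order_of_magnitude_gap(HUMAN_BRAIN_NEURONS, GPT4_PROCESSING_UNITS)
print(f"Rough gap: about {gap:.1f} orders of magnitude")
```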

A true AI (AGI) would need a lot more processors than GPT-4 currently has access to, so we can be very sure that while it may be a very intelligent system, it isn't self-aware. Once an AI is given the necessary number of processors, I don't think they're going to be able to fudge with it the way they are with these models.