this post was submitted on 04 Oct 2023
52 points (100.0% liked)

art

22288 readers
25 users here now

A community for sharing and discussing art, aesthetics, and music relating to '80s, '90s, and '00s retro microgenres and also art in general now!

Some cool genres and aesthetics include:

If you are unsure if a piece of media is on theme for this community, you can make a post asking if it fits. Discussion posts are encouraged, and particularly interesting topics will get pinned periodically.

No links to a store page or advertising. Links to bandcamps, soundclouds, playlists, etc are fine.

founded 4 years ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 1 points 10 months ago (1 children)

Interesting. Just a clarification: overfitting and memorization are not quite the same thing, to my understanding. Overfitting is when a model memorizes rather than generalizing, but very large models can and will do both. If you ask an image generator for "a reproduction of Starry Night by van Gogh hanging on the wall", or ask an LLM to complete "to be or not to be, that is _", you are referring to something very specific that you'd like reproduced exactly. If the model outputs what you wanted, you would call that memorization, but not overfitting. Still, you may want to suppress memorization, and you certainly don't want overfitting.

Side note: massively overparameterized models are better at both memorization and generalization, and they are naturally resistant to overfitting as I've defined it. That last part would have surprised early ML researchers, since they had observed the opposite trend, but the trend reverses once models get large enough. Such models will also sometimes memorize an example after a single pass through the data, even with no duplication, which is quite remarkable.
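To make the distinction concrete, here's a toy sketch (my own illustration, not from the thread; assumes NumPy is available): a model with enough parameters to fit every training point exactly "memorizes" its training data, and it is overfitting precisely when that memorization comes at the cost of accuracy on held-out data.

```python
import numpy as np

rng = np.random.default_rng(0)

# True relationship: y = 2x plus noise. Ten training points.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.3, size=10)

# Held-out points from the same distribution.
x_test = np.linspace(0.05, 0.95, 10)
y_test = 2 * x_test + rng.normal(0, 0.3, size=10)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Degree-9 polynomial: 10 parameters for 10 points, so it can
# interpolate (memorize) the training set essentially exactly.
overfit = np.polyfit(x_train, y_train, 9)

# Degree-1 polynomial: too few parameters to memorize, so it is
# forced to generalize.
simple = np.polyfit(x_train, y_train, 1)

# The high-degree fit has near-zero training error (memorization)
# but typically much larger error on held-out points (overfitting);
# the simple fit has similar error on both sets.
print("overfit train/test MSE:",
      mse(overfit, x_train, y_train), mse(overfit, x_test, y_test))
print("simple  train/test MSE:",
      mse(simple, x_train, y_train), mse(simple, x_test, y_test))
```

The interesting empirical point in the comment above is that this classical small-model picture breaks down for massively overparameterized networks, which can memorize individual examples while still generalizing well.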

[–] [email protected] 1 points 10 months ago

That's a fair interpretation, although I still consider it a failure state. These models shouldn't be used as storage and retrieval tools.