[–] [email protected] 2 points 1 week ago (1 children)

No problem, but if you're just tinkering around you could get by with even less memory, as long as the model stays resident in VRAM and you sample small pieces in small batches (see the sketch below).

We all had P-series GPUs and had to upgrade because the trainees' model didn't fit in 16GB (they probably had too much money), so I don't remember which card it was that had 24GB.
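
Roughly what I mean, as a minimal PyTorch sketch (the toy model and sizes are made up, just to show the pattern of keeping weights on the GPU while streaming small batches through):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy stand-in model: the weights move to the GPU once and stay there.
model = nn.Sequential(
    nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 512)
).to(device).eval()

# The full dataset lives in host RAM, not VRAM.
data = torch.randn(10_000, 512)

batch_size = 32  # small batches keep activation memory tiny
with torch.no_grad():
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size].to(device)  # move one small slice at a time
        out = model(batch)
        # ...do something with `out`; it's freed before the next slice loads
```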

[–] [email protected] 0 points 1 week ago

For just tinkering around, one could use SD1.5 with a 4GB VRAM GPU and stop after a few minutes. I spend quite some time on AI image generation, on average 4 hours per day for over a year now. New models, especially video generation, will need more VRAM, but since I don't do this commercially, I can't just pay 30k for a GPU.
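
As a rough illustration of the 4GB case with Hugging Face diffusers (an untested sketch; the repo id and memory options are the commonly used ones, your mileage may vary):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load SD1.5 in half precision to roughly halve the weight footprint.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # the usual SD1.5 repo id
    torch_dtype=torch.float16,
)

# Trade speed for memory so inference fits in ~4GB of VRAM.
pipe.enable_attention_slicing()   # compute attention in slices
pipe.enable_model_cpu_offload()   # keep idle submodules in host RAM (needs `accelerate`)

image = pipe(
    "a lighthouse on a cliff at sunset",
    num_inference_steps=25,
    height=512, width=512,        # SD1.5's native resolution
).images[0]
image.save("out.png")
```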