this post was submitted on 11 Jul 2023

Stable Diffusion

Discuss matters related to our favourite AI Art generation technology
[–] [email protected] 2 points 1 year ago (2 children)

That would be very cool. I am eyeing an Arc A770 16GB for ML applications, but if I can keep it to 8/12 GB of VRAM, a lot of second-hand GPUs would also work.

[–] [email protected] 4 points 1 year ago (1 children)

Don't get an Arc. ML stuff mostly relies on CUDA, which is only supported by Nvidia cards.

[–] [email protected] 1 points 1 year ago (2 children)

Apparently Intel's oneAPI is catching up quickly to CUDA, at least compared to AMD's ROCm mess.
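For what it's worth, in PyTorch-style frameworks the backend choice often boils down to which device string is available at runtime, so an Arc card mostly needs the framework to expose its device. A minimal sketch of that fallback logic (the `torch.cuda` / `torch.xpu` availability checks mentioned in the comment are an assumption about your install; `xpu` is the device name Intel's oneAPI-based PyTorch support uses):

```python
def pick_device(cuda_ok: bool, xpu_ok: bool) -> str:
    """Pick a device string: Nvidia CUDA first, then Intel XPU, else CPU."""
    if cuda_ok:
        return "cuda"
    if xpu_ok:  # Intel Arc via oneAPI (e.g. Intel Extension for PyTorch)
        return "xpu"
    return "cpu"

# In real code the flags would come from the framework, roughly:
#   cuda_ok = torch.cuda.is_available()
#   xpu_ok  = hasattr(torch, "xpu") and torch.xpu.is_available()
print(pick_device(False, True))  # on an Arc-only box: "xpu"
```

The catch is exactly the one raised below: this only works for applications that check for `xpu` at all, and many CUDA-first projects hardcode `"cuda"`.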

[–] [email protected] 2 points 1 year ago

Maybe, but I would be worried that I couldn't always run any application I wanted, or try out brand-new stuff, because I'd have to wait for support to be added.