this post was submitted on 11 Jul 2023
50 points (98.1% liked)
Stable Diffusion
That would be very cool. I am eyeing an Arc A770 16GB for ML applications, but if I can keep it to 8/12 GB of VRAM, a lot of second-hand GPUs would also work.
Don't get an Arc. ML stuff mostly relies on CUDA, which is only supported by Nvidia cards.
Apparently Intel's oneAPI is catching up quickly to CUDA, at least compared to AMD's ROCm mess.
Maybe, but I would be worried that I couldn't always run any application I wanted, or try out brand-new stuff, because I'd have to wait for support to be added.
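To illustrate the CUDA lock-in mentioned above: most ML repos pick a device with a CUDA-first check, so non-Nvidia cards only work if the project explicitly added a backend for them. This is a minimal sketch with a hypothetical `pick_device` helper; real PyTorch code would call `torch.cuda.is_available()` (and `torch.xpu.is_available()` for Arc via Intel's extension), but booleans stand in here so the sketch runs without a GPU.

```python
# Hypothetical sketch of the device-selection logic most ML scripts use.
# Booleans stand in for torch.cuda.is_available() / torch.xpu.is_available()
# so this runs anywhere; it is not a real PyTorch API.

def pick_device(cuda_available: bool, xpu_available: bool) -> str:
    """Return the device string a CUDA-first script would choose."""
    if cuda_available:      # Nvidia path: what most repos assume exists
        return "cuda"
    if xpu_available:       # Intel Arc path: only works if the project added it
        return "xpu"
    return "cpu"            # fallback, usually too slow for Stable Diffusion

print(pick_device(False, True))   # Arc card, no CUDA
```

The point of the sketch: if the second branch isn't there (and in many projects it isn't), an Arc card silently falls through to CPU, which is the "waiting on support" problem described above.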