I've been using various versions of Stable Diffusion with my RX 7800.
One popular method is lshqqytiger's SD fork, which uses DirectML. Is something like that possible here? SD doesn't run great on AMD: it's kind of a hack, and even 16 GB of VRAM isn't always enough. A low-memory version of this could be a game changer for AMD owners.
It can. I was able to get it to use my AMD GPU by using pip to install the ROCm build of torch instead of the CUDA one: pip install torch --index-url https://download.pytorch.org/whl/rocm6.0 (see pytorch.org for the current index URLs).
You will need to run pip uninstall torch first in your venv to remove the build you currently have.
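Putting the swap together, a minimal sketch (assuming a virtual environment and the ROCm 6.0 index URL mentioned above; check pytorch.org for the URL matching your ROCm version):

```shell
# Inside the project's virtual environment:
pip uninstall -y torch
pip install torch --index-url https://download.pytorch.org/whl/rocm6.0

# Verify the ROCm build is active: torch.version.hip is a version string
# on ROCm builds and None on CUDA/CPU builds. torch.cuda.is_available()
# returns True on ROCm too, since HIP reuses the torch.cuda namespace.
python -c "import torch; print(torch.version.hip, torch.cuda.is_available())"
```

Note that the ROCm build still exposes the GPU through the `torch.cuda` API, so code written for CUDA usually runs unchanged.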
Unfortunately, my GPU only has 4 GB of memory, so even on the lowest settings I never got an image to render before hitting an out-of-memory error. So I have been trying to get DemoFusion to render on my CPU instead; I know it would be slower, but at least I would get something. I have already installed the CPU build of torch, but I have not managed a successful run yet. If anyone can tell me what to tweak to make it use the CPU, I would appreciate the help.
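For picking CPU over a too-small GPU, a minimal sketch of the device-selection logic (the `pick_device` helper and the 6 GB threshold are hypothetical, not part of DemoFusion; the commented `pipe.to(...)` line assumes a diffusers-style pipeline object):

```python
import torch

def pick_device(min_vram_gb: float = 6.0) -> torch.device:
    """Prefer the GPU, but fall back to CPU when no GPU is visible
    or its total VRAM is below min_vram_gb (hypothetical threshold)."""
    if torch.cuda.is_available():
        total_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
        if total_gb >= min_vram_gb:
            return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
# Move the pipeline to the chosen device. On CPU, keep float32:
# float16 ops are slow or unsupported on many CPU backends.
# pipe = pipe.to(device, torch.float32)
```

The same idea applies to any `.to("cuda")` calls hard-coded in the scripts: replacing them with the selected `device` is usually what makes a CPU run possible.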