OutOfMemoryError: I think you are using inpainting with large images in batch, or it's unrelated #31
TheWorldIsOur started this conversation in General
Replies: 1 comment
-
The extension uses RAM, not VRAM, so I don't think this is directly related to its use, unless you do a lot of upscaling or inpainting. If you use inpainting in batch mode and the images are very large, it's possible that the inpainting step takes up VRAM. As for the workflow, there is already documentation, but it's true that a more detailed methodology could exist; I don't have much time to write it at the moment. I have 12 GB of VRAM, and I use medvram and xformers.
-
I get this error:

OutOfMemoryError: CUDA out of memory. Tried to allocate 11.39 GiB (GPU 0; 12.00 GiB total capacity; 14.32 GiB already allocated; 0 bytes free; 14.55 GiB reserved in total by PyTorch). If reserved memory is >> allocated memory, try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF.

Time taken: 3m 56.40s | Torch active/reserved: 14769/15466 MiB, Sys VRAM: 12288/12288 MiB (100.0%)
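As the error text itself suggests, allocator fragmentation can sometimes be reduced by setting max_split_size_mb through the PYTORCH_CUDA_ALLOC_CONF environment variable before launching the web UI. A minimal sketch; the value 512 is an illustrative guess, not a tested recommendation:

```shell
# Cap the size of split blocks in PyTorch's CUDA caching allocator
# (512 MiB is an illustrative value; tune for your GPU)
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512

# Then launch the web UI as usual, e.g. with the options the reply mentions:
# ./webui.sh --medvram --xformers
```

This only helps when reserved memory far exceeds allocated memory (fragmentation); it cannot make an 11.39 GiB allocation fit on a 12 GiB card that is already mostly occupied.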
Is 12 GB of VRAM (RTX 3060) and 16 GB of RAM not enough to run this extension?
And it would be awesome if there were some kind of workflow or tutorial, as I can't find anything like this...
Thank you for any help, and for your work.