
It doesn't seem like it's just me who doesn't know where the llm model should go. #23

Open
chenpipi0807 opened this issue Jun 6, 2024 · 1 comment

Comments

@chenpipi0807

I compared the Omost model storage paths, and they seem to be different. If possible, I would prefer to keep the model in the plugin directory.
Maybe it's a bug; looking forward to a fix.

@ThereforeGames

For me, the LLM downloaded automatically to C:/Users/name/.cache/huggingface/hub/models--lllyasviel--omost-llama-3-8b-4bits. I agree it would be better if it were saved to the ComfyUI models directory instead.
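Until the node exposes a configurable download path, one possible workaround is to redirect the Hugging Face cache by setting the `HF_HOME` environment variable before `transformers`/`huggingface_hub` are imported. A minimal sketch, assuming a ComfyUI install with a `models/LLM` subfolder (that path is an illustrative choice, not something the plugin defines):

```python
import os
from pathlib import Path

# Assumed target inside a ComfyUI install -- adjust to your setup.
comfy_models = Path("ComfyUI/models/LLM")
comfy_models.mkdir(parents=True, exist_ok=True)

# Redirect the Hugging Face cache to the ComfyUI models folder.
# This must happen BEFORE transformers/huggingface_hub are imported,
# because they read HF_HOME once at import time.
os.environ["HF_HOME"] = str(comfy_models)

# A later download, e.g.
#   from huggingface_hub import snapshot_download
#   snapshot_download("lllyasviel/omost-llama-3-8b-4bits")
# would then land under ComfyUI/models/LLM/hub/ instead of
# C:/Users/<name>/.cache/huggingface/hub/.
```

The same effect can be had by setting `HF_HOME` in the shell that launches ComfyUI, or by passing `cache_dir=` to `snapshot_download`/`from_pretrained` if the plugin ever exposes that argument.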
