\Lumina-Next-T2I\config.json' is not a valid JSON file. #74
Comments
Hi @danieldietzel,
Hi Pommes, I ran through the steps again and realized the second item in this config is supposed to be the LLM checkpoint, not the Lumina model: https://github.com/Alpha-VLLM/Lumina-T2X/blob/main/lumina_next_t2i/configs/infer/settings.yaml Now my YAML is:
But I get this error. I am on Windows, by the way, if that helps. Had to change all
This may not impact performance, but we have not tested whether it can run correctly on the Gloo backend. You could try running the mini version of Lumina-Next-T2I: https://github.com/Alpha-VLLM/Lumina-T2X/tree/main/lumina_next_t2i_mini
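For reference, a minimal sketch of the kind of settings.yaml layout being discussed — the key names (`model_path`, `text_encoder`) and paths here are assumptions for illustration, not the repository's actual schema; check the settings.yaml linked above for the real layout:

```yaml
# Hypothetical layout -- key names and paths are illustrative only.
model_path: /path/to/Lumina-Next-T2I      # the Lumina diffusion checkpoint
text_encoder: /path/to/llm-checkpoint     # second item: the LLM checkpoint, NOT the Lumina model
```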
I tried following the instructions here:
https://huggingface.co/Alpha-VLLM/Lumina-Next-T2I
And installed the repo via the Hugging Face CLI and via GitHub; in both cases I get this error:
[rank0]: OSError: It looks like the config file at '........\Lumina-T2X\Lumina-Next-T2I\config.json' is not a valid JSON file.
Using this command:
lumina_next infer -c "lumina_next_t2i/configs/infer/settings.yaml" "a snowman" "./outputs"
in my settings.yaml I've tried the local path, huggingface path, and repo download path.
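When this `OSError` appears, the file at that path usually exists but fails to parse (for example, a Git LFS pointer stub downloaded instead of the real file, or a partial download). A quick way to see the actual parse failure — which the `OSError` message hides — is to load the file with the standard `json` module directly. The helper below is a sketch; the path is a placeholder, not the exact one from the error:

```python
import json

def check_config(path):
    """Return None if the file parses as JSON, else a short diagnostic string."""
    try:
        with open(path, encoding="utf-8") as f:
            json.load(f)
        return None
    except json.JSONDecodeError as e:
        # e.lineno / e.colno pinpoint where parsing fails -- useful for
        # spotting LFS pointer stubs, which start with "version https://..."
        return f"invalid JSON at line {e.lineno}, column {e.colno}: {e.msg}"
    except FileNotFoundError:
        return "file not found - check the path in settings.yaml"

# Example (placeholder path):
# print(check_config(r"Lumina-Next-T2I/config.json"))
```

If the diagnostic points at line 1, column 1, open the file in a text editor: a valid `config.json` starts with `{`, while a Git LFS stub starts with plain text.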