
[MODEL REQUEST] FLUX.1-dev #140

Open
BrickDesignerNL opened this issue Dec 7, 2024 · 2 comments
Labels: Feature Request

Comments

@BrickDesignerNL

Is your feature request related to a problem? Please describe.
FLUX.1-dev is a good model for generating images.

Details of model being requested

Additional context for requested model
It is easier to get good results with FLUX.1-dev than with the default Stable Diffusion model.

@mestrona-3

Hi @BrickDesignerNL, thank you for the feature request! We'll add it to our list of requested models. In the meantime, we'd encourage you to try it out via our BYOM (bring your own model) approach at https://app.aihub.qualcomm.com/jobs/. Please feel free to post in Slack if the job fails.
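For a concrete starting point, BYOM means submitting a traced PyTorch module through the qai-hub Python client. Here is a minimal sketch; the toy module, device name, and input shape are placeholders, not a FLUX recipe:

```python
# Minimal BYOM sketch, assuming the qai-hub client is installed and configured
# (pip install qai-hub; qai-hub configure --api_token <token>).
# The toy module, device name, and input shape are placeholders.
import torch
import qai_hub as hub


class ToyModule(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.nn.functional.relu(x)


example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(ToyModule().eval(), example_input)

compile_job = hub.submit_compile_job(
    model=traced,
    device=hub.Device("Snapdragon X Elite CRD"),  # placeholder target device
    input_specs=dict(x=(1, 3, 224, 224)),
)
target_model = compile_job.get_target_model()  # compiled artifact, also shown on the jobs page
```

Note that large diffusion pipelines generally need to be split into components (text encoder, UNet/transformer, VAE) and compiled piece by piece, which is what the stable_diffusion_v2_1_quantized recipe in this repo does.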

mestrona-3 added the Feature Request label Dec 13, 2024
@BrickDesignerNL (Author) commented Dec 15, 2024

@mestrona-3 I'd love to, but the instructions there are limited. I do not see BYOM at the link you provided; I only see the Llama and Whisper models that I earlier 'created' via this repo.
I also find it hard to understand what I should change, and how, in https://github.com/quic/ai-hub-models/tree/main/qai_hub_models/models/stable_diffusion_v2_1_quantized

The goal is to make https://app.aihub.qualcomm.com/jobs/?type=compile&ownerKind=user&ownerName=@ work with https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/main or https://huggingface.co/Comfy-Org/stable-diffusion-3.5-fp8, converting the model to int8 so that it runs with https://github.com/quic/wos-ai-plugins/tree/main/plugins/gimp/stable-diffusion.

Or, even better, run safetensors files directly on the NPU. Is there a way to convert them to int8 for the NPU while keeping them in the safetensors format (so not a binary)? I see someone already did this:

https://huggingface.co/Disty0/FLUX.1-dev-qint8
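From the model card it looks like that checkpoint was made with optimum-quanto. If so, a minimal sketch along these lines should reproduce the idea (untested on my side; the bfloat16 dtype and output file names are my guesses, and as far as I can tell this targets PyTorch inference, not the QNN/NPU runtime):

```python
# Sketch of quanto-style int8 weight quantization of the FLUX transformer.
# Untested assumption: this mirrors what Disty0/FLUX.1-dev-qint8 did.
import json

import torch
from diffusers import FluxTransformer2DModel
from optimum.quanto import freeze, qint8, quantization_map, quantize
from safetensors.torch import save_file

# FLUX.1-dev is gated on Hugging Face, so this needs an authenticated login.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
)
quantize(transformer, weights=qint8)  # swap linear weights for int8 qtensors
freeze(transformer)                   # materialize the quantized weights

# quanto serializes to plain safetensors plus a JSON map used to reload it.
save_file(transformer.state_dict(), "flux-transformer-qint8.safetensors")
with open("flux-transformer-qint8-map.json", "w") as f:
    json.dump(quantization_map(transformer), f)
```

That keeps everything in safetensors, but it is still the PyTorch/quanto representation; getting it into the GIMP plugin would presumably still require the QNN conversion step.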

It would be nice to understand how to integrate that into https://github.com/quic/wos-ai-plugins/tree/main/plugins/gimp/stable-diffusion and run it on the NPU.
