Instructions for inferencing on AI PC do not work #245

Open
Jaylyn-Barbee opened this issue Jan 13, 2025 · 0 comments

Comments

@Jaylyn-Barbee (Contributor)

Hello,

The instructions listed here: https://github.com/microsoft/Phi-3CookBook/blob/main/md/03.Inference/AIPC_Inference.md#running-phi-3-with-intel-npu-acceleration-library for inferencing with Phi-3 on an AI PC do not work. Following the Jupyter notebook, I get the error

    TypeError: NPUModel.from_pretrained() missing 1 required positional argument: 'config'

when setting up the NPU wrapper from the pretrained model:

    model = npu_lib.NPUModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",
        dtype=npu_lib.int4,
        trust_remote_code=True
    )

Updating the call to pass a CompilerConfig object still leads to further errors, because the pipeline method expects the model to expose a .config attribute, which it does not have.

Let me know if you need more information. I ran this with Python 3.11.
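For reference, a minimal sketch of what the updated call might look like, assuming a recent intel-npu-acceleration-library where `from_pretrained` takes a `CompilerConfig` via the `config` argument (the exact import path and parameters are assumptions based on the error message, not verified against the cookbook):

```python
# Hypothetical fix: pass a CompilerConfig instead of dtype kwargs directly.
# Requires Intel NPU hardware and intel-npu-acceleration-library installed.
import intel_npu_acceleration_library as npu_lib
from intel_npu_acceleration_library.compiler import CompilerConfig

model_id = "microsoft/Phi-3-mini-4k-instruct"

# Quantization/compilation options now live in the CompilerConfig object.
compiler_conf = CompilerConfig(dtype=npu_lib.int4)

model = npu_lib.NPUModelForCausalLM.from_pretrained(
    model_id,
    config=compiler_conf,      # the positional 'config' the TypeError complains about
    torch_dtype="auto",
    trust_remote_code=True,
)
```

Even with this change, the downstream `pipeline(...)` failure reported above may persist if the wrapped model does not expose the `.config` attribute that transformers expects, so this sketch addresses only the first error.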
