Hello,

The instructions listed here: https://github.com/microsoft/Phi-3CookBook/blob/main/md/03.Inference/AIPC_Inference.md#running-phi-3-with-intel-npu-acceleration-library for running Phi-3 inference on an AI PC do not work. Following the Jupyter notebook, I get the error

`TypeError: NPUModel.from_pretrained() missing 1 required positional argument: 'config'`

when setting up the NPU wrapper from the pretrained model:

```python
model = npu_lib.NPUModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    dtype=npu_lib.int4,
    trust_remote_code=True
)
```
Updating the call to pass a `CompilerConfig` object still leads to further errors, because the `pipeline` method expects the model to have a `.config` attribute, which the NPU-wrapped model does not. Roughly what I tried is sketched below.
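For reference, this is approximately the variant I attempted after the `TypeError`. The `CompilerConfig` import path and the `config=` keyword are taken from the current intel-npu-acceleration-library README and may differ between library versions, and `model_id` is assumed to be the Phi-3-mini-4k-instruct checkpoint used in the cookbook notebook:

```python
import intel_npu_acceleration_library as npu_lib
from intel_npu_acceleration_library.compiler import CompilerConfig
from transformers import AutoTokenizer, pipeline

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed checkpoint from the notebook

# Request int4 quantization through the compiler configuration
compiler_conf = CompilerConfig(dtype=npu_lib.int4)

# Supply the config argument that the TypeError complains about
model = npu_lib.NPUModelForCausalLM.from_pretrained(
    model_id,
    config=compiler_conf,
    torch_dtype="auto",
    trust_remote_code=True,
).eval()

tokenizer = AutoTokenizer.from_pretrained(model_id)

# The next failure appears here: pipeline() inspects model.config,
# which the NPU-wrapped model does not expose
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
```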
Let me know if you need more information. I ran this with Python 3.11.