RTMPose: got different results by using SDK Python API and demo of inferencing with onnxruntime in Python after model deployed #2887
Replies: 1 comment
Solved. I forgot to modify the padding factor in https://github.com/open-mmlab/mmpose/blob/main/projects/rtmpose/examples/onnxruntime/main.py
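For anyone hitting the same mismatch: the example script expands the detection bbox by a padding factor before cropping, and that factor must match what was used at training/export time, or the crop fed to ONNX Runtime differs from the SDK's. A minimal sketch of the bbox-to-center/scale conversion (the function name and the 1.25 default follow main.py; treat the exact signature as an assumption):

```python
import numpy as np

def bbox_xyxy2cs(bbox, padding=1.25):
    """Convert an (x1, y1, x2, y2) bbox to a center/scale pair,
    expanding the box by `padding` so the crop matches the
    preprocessing the model saw during training."""
    x1, y1, x2, y2 = bbox[:4]
    center = np.array([(x1 + x2) * 0.5, (y1 + y2) * 0.5])
    scale = np.array([x2 - x1, y2 - y1]) * padding
    return center, scale
```

If your deployment config uses a different padding (e.g. 1.0), change the default here accordingly before building the affine crop.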
Hi,
I'm trying to run inference with onnxruntime in Python on an RTMPose model trained on my own dataset, but https://github.com/open-mmlab/mmpose/blob/main/projects/rtmpose/examples/onnxruntime/main.py gives results that differ from the torch model's predictions. However, when I use the SDK Python API example from https://github.com/open-mmlab/mmpose/tree/main/projects/rtmpose, the results match the torch model.
Any hints on this problem, or on running inference for an RTMPose model with onnxruntime alone rather than the mmlab packages? Thanks a lot~
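For ONNX-only inference without the mmlab packages, the exported model's SimCC head outputs can be decoded with plain NumPy. A sketch under the assumption that the model emits per-axis SimCC logits of shape (N, K, W*ratio) and (N, K, H*ratio) with the RTMPose default split ratio of 2.0 (the min-of-peaks confidence mirrors mmpose's decoder, but verify against your export):

```python
import numpy as np

def decode_simcc(simcc_x, simcc_y, simcc_split_ratio=2.0):
    """Decode SimCC logits into keypoint coordinates and scores.

    simcc_x: (N, K, W * ratio) logits over the x axis
    simcc_y: (N, K, H * ratio) logits over the y axis
    """
    x_locs = np.argmax(simcc_x, axis=2)
    y_locs = np.argmax(simcc_y, axis=2)
    # confidence: the smaller of the two per-axis peak values
    scores = np.minimum(np.max(simcc_x, axis=2), np.max(simcc_y, axis=2))
    keypoints = np.stack([x_locs, y_locs], axis=-1) / simcc_split_ratio
    return keypoints, scores
```

The resulting keypoints are in the coordinates of the model's input crop; they still need to be mapped back to the original image via the inverse of the affine crop transform.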
Below is my config for training:
Below is my command of model deployment: