
Are there any tutorials or examples on supporting custom layer QDQ INT8 explicit quantization? #13

Open
PeiqinSun opened this issue Sep 1, 2022 · 1 comment

@PeiqinSun

Thank you very much for this Chinese-language tutorial series!
I would like to ask: do you have any experience deploying explicit INT8 quantization for custom operators (i.e., the ONNX-based QDQ form)?
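
(For context, a minimal sketch of what "the ONNX QDQ form" refers to: a float32 ONNX model with explicit QuantizeLinear/DequantizeLinear nodes inserted, here via onnxruntime's static quantization. The file names, input name, and random data reader are placeholders, and this does not cover the custom-operator part of the question.)

```python
# Sketch only: produce a QDQ-form (explicitly quantized) ONNX model with
# onnxruntime's static quantization. "model_fp32.onnx" and MyDataReader are
# hypothetical placeholders.
import numpy as np
from onnxruntime.quantization import (
    CalibrationDataReader,
    QuantFormat,
    QuantType,
    quantize_static,
)


class MyDataReader(CalibrationDataReader):
    """Feeds a few calibration batches; replace with real preprocessed data."""

    def __init__(self, input_name="input", num_batches=8):
        self.batches = iter(
            {input_name: np.random.randn(1, 3, 224, 224).astype(np.float32)}
            for _ in range(num_batches)
        )

    def get_next(self):
        return next(self.batches, None)


quantize_static(
    "model_fp32.onnx",             # float32 ONNX model
    "model_qdq_int8.onnx",         # output with QuantizeLinear/DequantizeLinear nodes
    MyDataReader(),
    quant_format=QuantFormat.QDQ,  # explicit Q/DQ nodes rather than QOperator
    activation_type=QuantType.QInt8,
    weight_type=QuantType.QInt8,
)
```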

@LitLeo
Owner

LitLeo commented Sep 1, 2022

Hi, I haven't used the QDQ form. I always convert the model in float32 format and then use TRT to do INT8.
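
(A minimal sketch of the workflow described above, assuming the TensorRT 8.x Python API with an entropy calibrator: parse a float32 ONNX model and let TensorRT do post-training INT8 calibration, i.e. implicit quantization. The model path, input shape, and random calibration batches are placeholders rather than anything from this repo.)

```python
# Sketch: build an INT8 engine from a float32 ONNX model with TensorRT's
# post-training calibration (implicit quantization). Paths and shapes are
# placeholders; assumes a static-shape ONNX input.
import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

LOGGER = trt.Logger(trt.Logger.WARNING)


class RandomCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds a few random batches; real use should feed preprocessed validation data."""

    def __init__(self, shape=(1, 3, 224, 224), num_batches=8):
        super().__init__()
        self.batch = np.random.randn(*shape).astype(np.float32)
        self.d_input = cuda.mem_alloc(self.batch.nbytes)
        self.remaining = num_batches

    def get_batch_size(self):
        return self.batch.shape[0]

    def get_batch(self, names):
        if self.remaining == 0:
            return None  # no more batches: calibration ends
        self.remaining -= 1
        cuda.memcpy_htod(self.d_input, self.batch)
        return [int(self.d_input)]

    def read_calibration_cache(self):
        return None  # always recalibrate in this sketch

    def write_calibration_cache(self, cache):
        pass


builder = trt.Builder(LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, LOGGER)
with open("model_fp32.onnx", "rb") as f:
    assert parser.parse(f.read()), parser.get_error(0)

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)
calibrator = RandomCalibrator()   # keep a Python reference alive during build
config.int8_calibrator = calibrator

engine_bytes = builder.build_serialized_network(network, config)  # TensorRT 8+
with open("model_int8.engine", "wb") as f:
    f.write(engine_bytes)
```

In a real build, the calibrator would typically iterate over a few hundred preprocessed validation samples and cache the resulting scale table via read/write_calibration_cache.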
