Please provide the following information to quickly locate the problem

I want to take a model fine-tuned with PaddleOCR's quantization-aware training (QAT), convert it to ONNX, and then run inference through another toolchain such as RKNN, but I hit the following problem.

I convert with:

paddle2onnx --model_dir ./model/ch_PP-OCRv3_det_slim_infer --model_filename inference.pdmodel --params_filename inference.pdiparams --save_file ./model/rec_onnx/model.onnx --opset_version 13

Then I run predict_det with the ONNX backend selected, and the prediction results are wrong. If I set onnx_format to False and convert the model to ONNX again, the results are normal. Why?
Could you provide the command to reproduce this?
python predict_det.py --use_gpu=True --use_onnx=True --det_model_dir=./output/det/infer/model.onnx --image_dir=./data/ccpd_green/test

That is: I export the QAT-trained model with deploy/slim/quantization/export_model.py and then convert it to ONNX. With onnx_format=False, the exported ONNX model contains fake_quantize ops, and predict_det then fails with an error saying it is a quantized model. With onnx_format=True, the command above runs, but the prediction results are all wrong.
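One way to confirm which flavor of quantized graph an export produced is simply to list the op types in the ONNX file. The helper below is an illustrative sketch, not from this thread: the assumption is that onnx_format=False leaves Paddle-style fake_quantize* ops in the graph, while onnx_format=True emits standard QuantizeLinear/DequantizeLinear (QDQ) nodes. It operates on any loaded onnx.ModelProto.

```python
# Illustrative sketch: classify an exported ONNX graph by its quantization ops.
# Assumption (not stated in this thread): Paddle-specific "fake_quantize*" op
# names mark a graph standard ONNX runtimes reject; QuantizeLinear /
# DequantizeLinear mark a standard QDQ graph.

QDQ_OPS = {"QuantizeLinear", "DequantizeLinear"}

def classify_quant_ops(model):
    """Classify a loaded onnx.ModelProto as 'fake_quantize', 'qdq', or 'float'."""
    op_types = {node.op_type for node in model.graph.node}
    if any(op.startswith("fake_quantize") for op in op_types):
        return "fake_quantize"  # Paddle-specific ops: standard runtimes reject these
    if op_types & QDQ_OPS:
        return "qdq"            # standard ONNX quantization representation
    return "float"              # no quantization ops at all
```

Usage would be `classify_quant_ops(onnx.load("./output/det/infer/model.onnx"))`; if the converter for your target (e.g. RKNN) rejects either flavor, that narrows the problem to the quantized-export path rather than predict_det.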
This is caused by paddle2onnx not supporting the conversion of quantized models. Please ask about it under this link.
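Before concluding that the converted graph itself is broken, it can help to compare the raw detector output maps from the Paddle inference model and the ONNX model on the same preprocessed input. A minimal, pure-Python comparison sketch (the backend calls that produce the two flat output sequences are omitted, and the tolerance is an arbitrary illustration, since a quantized model will never match bit-for-bit):

```python
# Sketch: compare raw outputs of two inference backends (e.g. Paddle vs.
# ONNXRuntime) on the same input, before blaming predict_det post-processing.

def max_abs_diff(a, b):
    """Max elementwise |a - b| over two equal-length flat float sequences."""
    if len(a) != len(b):
        raise ValueError("outputs have different sizes: %d vs %d" % (len(a), len(b)))
    return max((abs(x - y) for x, y in zip(a, b)), default=0.0)

def outputs_agree(a, b, tol=1e-2):
    """Loose agreement check; tol is illustrative, tune it per model."""
    return max_abs_diff(a, b) <= tol
```

If the raw maps already disagree wildly, the discrepancy was introduced at conversion time, which is consistent with paddle2onnx not handling the quantized graph.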
zhangyubo0722