Fine-tune ONNX Models (MMSegmentation) Inference for NVIDIA Jetson (2): tune a generic semantic segmentation ONNX model for deployment on the NVIDIA Jetson platform #582
OpenMMLab-Assistant-004 started this conversation in Other
To deploy on the NVIDIA Jetson platform, you can follow these steps to optimize generic ONNX models to meet the specific requirements of the platform:
[https://platform.openmmlab.com/deploee/onnx-list](https://platform.openmmlab.com/deploee/onnx-list)
Model Selection: Choose a generic ONNX model suitable for the Jetson platform as the base model.
Choose one of the following semantic segmentation models to deploy:
[MMSegmentation](https://mmdeploy.readthedocs.io/en/latest/04-supported-codebases/mmseg.html): fastscnn, bisenetv2, stdc
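For reference, the conversion step for one of these models could look roughly like the sketch below, which uses MMDeploy's Python API to export FastSCNN to ONNX. The deploy config, model config, checkpoint, and sample image paths are placeholders, and the exact config file names vary between MMDeploy/MMSegmentation versions, so treat this as a sketch rather than a ready-to-run command.

```python
# A minimal sketch of the ONNX export step, assuming MMDeploy's Python API.
# All file paths below are placeholders: adjust the deploy config, model config,
# checkpoint, and sample image to your local MMDeploy/MMSegmentation setup.
from mmdeploy.apis import torch2onnx

torch2onnx(
    img='demo/demo.png',                  # any sample image used for tracing
    work_dir='work_dirs/fastscnn_onnx',   # output directory for the export
    save_file='end2end.onnx',             # name of the exported ONNX file
    deploy_cfg='configs/mmseg/segmentation_onnxruntime_dynamic.py',           # assumed deploy config
    model_cfg='configs/fastscnn/fast_scnn_8xb4-160k_cityscapes-512x1024.py',  # assumed model config
    model_checkpoint='checkpoints/fast_scnn_cityscapes.pth',                  # assumed checkpoint
    device='cuda:0',
)
```

The same conversion can also be driven from the command line with MMDeploy's tools/deploy.py; with a TensorRT deploy config that route additionally builds the engine file consumed on the Jetson.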
Project requirements:
Choose one model from [MMSegmentation (semantic segmentation)](https://mmdeploy.readthedocs.io/en/latest/04-supported-codebases/mmseg.html): fastscnn, bisenetv2, or stdc, and deploy it on the NVIDIA Jetson platform. The model can be tuned through the following steps:
Deliverables:
Dataset and data processing: describe the dataset used for model tuning and testing, and explain the data preprocessing pipeline.
Model tuning: describe the tuning process in detail, including the chosen base model, the fine-tuning strategy, and the performance optimization methods.
Model deployment: explain how to deploy the tuned model to the Jetson platform.
Summary and evaluation: summarize the project results, and evaluate the model's performance, efficiency, and adaptability on the Jetson platform.
Hardware access is available through the [deployment platform](https://platform.openmmlab.com/deploee?lang=zh-CN); use a Jetson Orin.
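As a starting point for the deployment and evaluation write-up, a quick sanity check of the converted engine on the Jetson Orin could look like the sketch below. The model config, deploy config, and engine path are assumptions and should be replaced with the outputs of your own conversion.

```python
# Minimal sketch: sanity-check the converted TensorRT engine on the Jetson Orin.
# All paths are assumptions; replace them with the outputs of your own conversion.
from mmdeploy.apis import inference_model

model_cfg = 'configs/fastscnn/fast_scnn_8xb4-160k_cityscapes-512x1024.py'  # assumed model config
deploy_cfg = 'configs/mmseg/segmentation_tensorrt_static-1024x2048.py'     # assumed deploy config
backend_files = ['work_dirs/fastscnn_trt/end2end.engine']                  # assumed TensorRT engine
img = 'demo/demo.png'                                                      # any test image

# Runs preprocessing, the TensorRT engine, and postprocessing end to end and
# returns the predicted segmentation for the input image.
result = inference_model(model_cfg, deploy_cfg, backend_files, img, device='cuda:0')
print(result)
```

Because inference_model rebuilds the backend model on every call, it is only suitable as a correctness check; for the latency and throughput numbers requested in the evaluation deliverable, build the model once and time repeated forward passes, or use MMDeploy's profiling tooling.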
Task Information
Repository: Other
Difficulty: Hard
Points: 50
Estimated Time Consumption: 21 days
Task Status
Claimant's GitHub ID: None
Claim Time: August 23, 2023
PR Link: None
Points to Note
To claim the task, please reply "Claim the mission" in the comment section. After claiming the task, please add our assistant bot MeowMewo on WeChat: openmmlabwx.
After completing the task, please submit a PR with a title starting with [MMSIG]. After submission, reply with your PR link in the comment section of this Discussion. We will review your work as soon as possible.