[Docs]Fix docs about cfg (open-mmlab#184)
* add docs about config comment

* fix blank

* fix comment

* fix comment

* fix comment

* fix comment
VVsssssk authored Nov 12, 2021
1 parent 1c86b39 commit fa626a5
Showing 2 changed files with 44 additions and 17 deletions.
27 changes: 17 additions & 10 deletions docs/tutorials/how_to_convert_model.md
@@ -2,16 +2,17 @@

<!-- TOC -->

- [How to convert model](#how-to-convert-model)
- [How to convert models from Pytorch to other backends](#how-to-convert-models-from-pytorch-to-other-backends)
- [Prerequisite](#prerequisite)
- [Usage](#usage)
- [Description of all arguments](#description-of-all-arguments)
- [Example](#example)
- [How to evaluate the exported models](#how-to-evaluate-the-exported-models)
- [List of supported models exportable to other backends](#list-of-supported-models-exportable-to-other-backends)
- [Reminders](#reminders)
- [FAQs](#faqs)
- [Tutorial: How to convert model](#how-to-convert-model)
- [How to convert models from PyTorch to BACKEND](#how-to-convert-models-from-pytorch-to-other-backends)
- [Prerequisite](#prerequisite)
- [Usage](#usage)
- [Description of all arguments](#description-of-all-arguments)
- [How to find the corresponding deployment config of a PyTorch model](#how-to-find-the-corresponding-deployment-config-of-a-pytorch-model)
- [Example](#example)
- [How to evaluate the exported models](#how-to-evaluate-the-exported-models)
- [List of supported models exportable to BACKEND](#list-of-supported-models-exportable-to-other-backends)
- [Reminders](#reminders)
- [FAQs](#faqs)

<!-- TOC -->

@@ -58,6 +59,12 @@ python ./tools/deploy.py \
- `--show` : Whether to show detection outputs.
- `--dump-info` : Whether to output information for SDK.

#### How to find the corresponding deployment config of a PyTorch model

1. Find the model's codebase folder in `configs/`. For example, to convert a YOLOv3 model, look in the `configs/mmdet` folder.
2. Find the model's task folder in `configs/codebase_folder/`. For the YOLOv3 model, that is the `configs/mmdet/single-stage` folder.
3. Find the deployment config file in `configs/codebase_folder/task_folder/`. To deploy the YOLOv3 model you can use `configs/mmdet/single-stage/single-stage_onnxruntime_dynamic.py`, as in the sketch below.
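
To illustrate the three steps above, the following sketch (not part of MMDeploy; it assumes the working directory is the repository root) lists candidate ONNX Runtime deployment configs for YOLOv3:

```python
# Sketch: locate deployment configs for a YOLOv3 + ONNX Runtime conversion.
# Assumes the working directory is the MMDeploy repository root.
from pathlib import Path

task_folder = Path('configs/mmdet/single-stage')          # steps 1 and 2
for cfg in sorted(task_folder.glob('*onnxruntime*.py')):  # step 3
    print(cfg)  # expected to include single-stage_onnxruntime_dynamic.py
```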

#### Example

34 changes: 27 additions & 7 deletions docs/tutorials/how_to_write_config.md
@@ -19,9 +19,10 @@ This tutorial describes how to write a config for model conversion and deployment
- [3. How to write backend config](#3-how-to-write-backend-config)
- [Example](#example-4)
- [4. A complete example of mmcls on TensorRT](#4-a-complete-example-of-mmcls-on-tensorrt)
- [5. How to write model config](#5-how-to-write-model-config)
- [6. Reminder](#6-reminder)
- [7. FAQs](#7-faqs)
- [5. The naming rules of our deployment config](#5-the-naming-rules-of-our-deployment-config)
- [6. How to write model config](#6-how-to-write-model-config)
- [7. Reminder](#7-reminder)
- [8. FAQs](#8-faqs)

<!-- TOC -->

@@ -131,7 +132,7 @@ The backend config is mainly used to specify the backend on which model runs and
backend_config = dict(
type='tensorrt',
common_config=dict(
fp16_mode=False, log_level=trt.Logger.INFO, max_workspace_size=1 << 30)
fp16_mode=False, log_level=trt.Logger.INFO, max_workspace_size=1 << 30),
model_inputs=[
dict(
input_shapes=dict(
@@ -188,14 +189,33 @@ onnx_config = dict(
partition_config = None
```
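
To show how the pieces of section 4 fit together, here is a minimal sketch of a deployment config for mmcls on TensorRT. The field names follow MMDeploy v0.x conventions (`onnx_config`, `codebase_config`, `backend_config`); the concrete values such as input shapes, opset version and file names are illustrative assumptions rather than values taken from the file above.

```python
# Sketch of a deployment config for mmcls on TensorRT (MMDeploy v0.x style).
# All concrete values below are illustrative assumptions.
import tensorrt as trt

onnx_config = dict(
    type='onnx',
    export_params=True,
    keep_initializers_as_inputs=False,
    opset_version=11,
    save_file='end2end.onnx',
    input_names=['input'],
    output_names=['output'],
    input_shape=[224, 224])

codebase_config = dict(type='mmcls', task='Classification')

backend_config = dict(
    type='tensorrt',
    common_config=dict(
        fp16_mode=False, log_level=trt.Logger.INFO, max_workspace_size=1 << 30),
    model_inputs=[
        dict(
            input_shapes=dict(
                input=dict(
                    min_shape=[1, 3, 224, 224],
                    opt_shape=[1, 3, 224, 224],
                    max_shape=[1, 3, 224, 224])))
    ])
```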

### 5. How to write model config
### 5. The naming rules of our deployment config

There is a specific naming convention for the filename of deployment config files.

```bash
(task name)_(partition)_(backend name)_(dynamic or static).py
```

- `task name`: Model's task type.
- `partition`: Optional; present when the model is partitioned for deployment.
- `backend name`: The name of the backend. Note that if you use quantization, you need to indicate the quantization type as well, e.g. `tensorrt-int8`.
- `dynamic or static`: Dynamic or static export. Note that if the backend requires explicit shape information, you need to append the input size in `height x width` format, e.g. `dynamic-512x1024-2048x2048`, which means the minimum input shape is `512x1024` and the maximum input shape is `2048x2048`.

#### Example

```
single-stage_partition_tensorrt-int8_dynamic-320x320-1344x1344.py
```
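
To make the fields concrete, here is a hypothetical helper (not part of MMDeploy) that splits such a filename back into the parts described above:

```python
# Hypothetical helper: split a deployment config filename into the fields
# described by the naming rule above.
def parse_deploy_cfg_name(filename: str) -> dict:
    parts = filename[:-len('.py')].split('_')
    has_partition = 'partition' in parts
    if has_partition:
        parts.remove('partition')
    task, backend, shape_mode = parts
    return dict(task=task, partition=has_partition,
                backend=backend, shape=shape_mode)


print(parse_deploy_cfg_name(
    'single-stage_partition_tensorrt-int8_dynamic-320x320-1344x1344.py'))
# -> {'task': 'single-stage', 'partition': True,
#     'backend': 'tensorrt-int8', 'shape': 'dynamic-320x320-1344x1344'}
```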

### 6. How to write model config

Write the model config file according to the model's codebase. The model config is used to initialize the model; refer to [MMClassification](https://github.com/open-mmlab/mmclassification/blob/master/docs/tutorials/config.md), [MMDetection](https://github.com/open-mmlab/mmdetection/blob/master/docs_zh-CN/tutorials/config.md), [MMSegmentation](https://github.com/open-mmlab/mmsegmentation/blob/master/docs_zh-CN/tutorials/config.md), [MMOCR](https://github.com/open-mmlab/mmocr/tree/main/configs), [MMEditing](https://github.com/open-mmlab/mmediting/blob/master/docs_zh-CN/config.md).
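
For reference, a model config is an ordinary OpenMMLab config file and can be inspected with `mmcv` (a minimal sketch assuming mmcv 1.x; the config path below is a hypothetical MMClassification example):

```python
# Minimal sketch, assuming mmcv 1.x; the config path is a hypothetical
# MMClassification example -- substitute your own codebase's model config.
from mmcv import Config

model_cfg = Config.fromfile('configs/resnet/resnet18_b32x8_imagenet.py')
print(model_cfg.model.type)  # e.g. 'ImageClassifier' for mmcls models
```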

### 6. Reminder
### 7. Reminder

None

### 7. FAQs
### 8. FAQs

None
