
TDKD


Task Decoupled Knowledge Distillation For Lightweight Face Detectors (Accepted by ACM MM 2020)

Abstract

We propose a knowledge distillation method for the face detection task. The method decouples the distillation task of face detection into two subtasks: a classification distillation subtask and a regression distillation subtask. We add task-specific convolutions to the teacher network and adaptation convolutions on the feature maps of the student network to generate the task-decoupled features. Each subtask then distills the features using different samples, so that the distillation is consistent with the corresponding detection subtask. Moreover, we propose an effective probability distillation method to jointly boost the accuracy of the student network.
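For illustration, the task-decoupled feature distillation could be implemented roughly as in the PyTorch sketch below. All names here (TaskDecoupledDistiller, adapt_cls, the mask tensors, and so on) are hypothetical and not taken from this repository; the sketch only shows the idea of distilling classification and regression features with different sample masks.

# Hypothetical sketch of task-decoupled feature distillation;
# not this repository's actual code.
import torch.nn as nn
import torch.nn.functional as F

class TaskDecoupledDistiller(nn.Module):
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # Adaptation convolutions on the student feature maps, one per subtask.
        self.adapt_cls = nn.Conv2d(student_channels, teacher_channels, 1)
        self.adapt_reg = nn.Conv2d(student_channels, teacher_channels, 1)

    def forward(self, feat_s, feat_t_cls, feat_t_reg, mask_cls, mask_reg):
        # feat_s: student feature map, shape (N, Cs, H, W).
        # feat_t_cls / feat_t_reg: teacher features after its task-specific convs.
        # mask_cls / mask_reg: (N, 1, H, W) masks selecting the samples used by
        # each distillation subtask, so the two subtasks use different samples.
        loss_cls = F.mse_loss(self.adapt_cls(feat_s) * mask_cls,
                              feat_t_cls * mask_cls)
        loss_reg = F.mse_loss(self.adapt_reg(feat_s) * mask_reg,
                              feat_t_reg * mask_reg)
        return loss_cls + loss_reg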


Evaluation results of RetinaFace on the WIDER FACE validation set

Model                                Easy     Medium   Hard
RetinaFace-Mobilenet0.25 (Student)   87.1%    85.7%    79.2%
RetinaFace-Mobilenet0.25 (TDKD)      88.9%    87.5%    81.5%

Installation

  1. Change to the folder where you want to download this repo:
cd /your/own/path/
  2. Clone the repository:
git clone https://github.com/CASIA-IVA-Lab/TDKD.git
  3. Install the dependencies.

Data

  1. Download the WIDER FACE dataset. [password: qisa]
  2. Organise the dataset directory as follows (a quick layout check is sketched after the tree):
  ./data/widerface/
    train/
      images/
      label.txt
    val/
      images/
      wider_val.txt
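To catch path mistakes early, here is a small sanity check for the layout above. The paths are taken from this README; the script itself is not part of the repository:

# Check that the WIDER FACE directory layout matches what the code expects.
from pathlib import Path

root = Path("./data/widerface")
for rel in ("train/images", "train/label.txt", "val/images", "val/wider_val.txt"):
    path = root / rel
    print(("OK     " if path.exists() else "MISSING") + f" {path}")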

Train

  1. The teacher model and the pre-trained weights are already integrated in the code, so you can start distillation directly:
CUDA_VISIBLE_DEVICES=0 python train.py --mode distillation
  2. If you want to train the student network or the teacher network yourself, run one of the following commands (a sketch of the mode dispatch follows the commands):
# Train the student network
CUDA_VISIBLE_DEVICES=0 python train.py --mode student
# Train the teacher network
CUDA_VISIBLE_DEVICES=0 python train.py --mode teacher
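As a rough picture of what the --mode flag selects, the dispatch presumably looks like the sketch below. Only the flag values come from the commands above; the branch bodies are placeholders, not the repository's actual train.py:

# Hypothetical sketch of the --mode dispatch in train.py (placeholder bodies).
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--mode", choices=["student", "teacher", "distillation"],
                    default="distillation")
args = parser.parse_args()

if args.mode == "distillation":
    # Load the frozen teacher and the pre-trained weights, then train the
    # student with detection loss plus the task-decoupled distillation losses.
    ...
elif args.mode == "student":
    # Train RetinaFace-Mobilenet0.25 with the detection loss only.
    ...
else:  # "teacher"
    # Train the larger teacher detector from scratch.
    ...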

Evaluation

  1. Generate the txt files of the detection results (the expected txt format is sketched after these commands):
python test_widerface.py
  2. Evaluate the txt results. The scripts under ./widerface_evaluate are adapted from an external WiderFace evaluation implementation:
cd ./widerface_evaluate
python setup.py build_ext --inplace
python evaluation.py
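For reference, the evaluation consumes one txt file per image in the standard WIDER FACE submission format: the image name on the first line, the number of detections on the second, then one "x y w h score" line per box. The writer below is a hypothetical sketch of that format, not a function from test_widerface.py:

# Minimal sketch: write detections for one image in the standard
# WIDER FACE txt format (hypothetical helper).
import os

def save_widerface_txt(save_dir, event, image_name, boxes):
    # boxes: list of (x, y, w, h, score) tuples for this image
    out_dir = os.path.join(save_dir, event)
    os.makedirs(out_dir, exist_ok=True)
    with open(os.path.join(out_dir, image_name + ".txt"), "w") as f:
        f.write(image_name + "\n")
        f.write(f"{len(boxes)}\n")
        for x, y, w, h, score in boxes:
            f.write(f"{x:.1f} {y:.1f} {w:.1f} {h:.1f} {score:.3f}\n")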

Citation

If you find TDKD useful in your research, please cite:

@inproceedings{LiangZZJTW20,
  title = {Task Decoupled Knowledge Distillation For Lightweight Face Detectors},
  author = {Liang, Xiaoqing and Zhao, Xu and Zhao, Chaoyang and Jiang, Nanfei and Tang, Ming and Wang, Jinqiao},
  booktitle = {{MM} '20: The 28th {ACM} International Conference on Multimedia, Virtual Event / Seattle, WA, USA, October 12-16, 2020},
  year = {2020}
}
