Sara Elkerdawy<sup>1</sup>, Mostafa Elhoushi<sup>2</sup>, Hong Zhang<sup>1</sup>, Nilanjan Ray<sup>1</sup>
<sup>1</sup> Computing Science Department, University of Alberta, Canada
<sup>2</sup> Toronto Heterogeneous Compilers Lab, Huawei, Canada
Sample training code for CIFAR for dynamic pruning with self-supervised mask prediction.
[Project Page], [Paper CVPR22], [Poster], [Video]
FLOPs reduction vs. accuracy drop from baselines for various dynamic and static models on ResNet34 ImageNet.

```bash
virtualenv .envpy36 -p python3.6  # Initialize environment
source .envpy36/bin/activate
pip install -r req.txt            # Install dependencies
sh job_baseline.sh                # Train a baseline; you can change the model at line 5
sh job_dynamic.sh                 # Train with dynamic pruning; you can change the model at line 5 and the threshold at line 40
```
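For intuition only, below is a minimal, hypothetical PyTorch sketch of the idea: a small head predicts a per-channel keep mask, and its training target is derived self-supervised from which channels actually fire in the layer's own activations. The class name `GatedConvBlock`, the `mask_head` design, the target construction, and the threshold handling are illustrative assumptions, not the code in this repository; see `job_dynamic.sh` and the paper for the actual training setup.

```python
# Conceptual sketch of dynamic channel gating with a self-supervised mask
# target ("fire together, wire together"). Illustrative only; NOT the
# repository's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedConvBlock(nn.Module):
    """Conv block whose output channels are gated by a tiny mask predictor."""

    def __init__(self, in_ch, out_ch, mthresh=0.92):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(out_ch)
        # Lightweight mask predictor: global-pooled input -> per-channel keep score.
        self.mask_head = nn.Sequential(
            nn.Linear(in_ch, out_ch),
            nn.Sigmoid(),
        )
        self.mthresh = mthresh  # keep channels whose score exceeds this threshold

    def forward(self, x):
        # Predict a per-sample, per-channel keep probability from the block input.
        scores = self.mask_head(F.adaptive_avg_pool2d(x, 1).flatten(1))  # (B, out_ch)
        out = F.relu(self.bn(self.conv(x)))

        # Self-supervised target (assumed form): channels that actually "fire"
        # (above-average mean activation) for this sample should be kept.
        with torch.no_grad():
            act = out.mean(dim=(2, 3))                            # (B, out_ch)
            target = (act > act.mean(dim=1, keepdim=True)).float()

        mask_loss = F.binary_cross_entropy(scores, target)

        # Hard-threshold the scores; a sparse kernel could skip pruned channels
        # entirely, here we only zero them for illustration.
        hard_mask = (scores > self.mthresh).float()
        gated = out * hard_mask[:, :, None, None]
        return gated, mask_loss


if __name__ == "__main__":
    block = GatedConvBlock(3, 64, mthresh=0.92)
    y, m_loss = block(torch.randn(2, 3, 32, 32))
    print(y.shape, m_loss.item())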
`mthresh` is the mask prediction threshold used at inference, and `mode` is the mask-predictor training mode (joint with the backbone or decoupled).

dataset | model | mthresh | mode | Accuracy | FLOPs Reduction (%)
---|---|---|---|---|---
cifar10 | vgg16-bn | - | Baseline | 93.82% | -
| | 0.92 | joint | 93.55% | 65%
| | 0.92 | decoupled | 93.73% | 56%
| | 0.85 | decoupled | 93.19% | 73%
| | 0.88 | joint | 92.65% | 74%
| resnet56 | - | Baseline | 93.66% | -
| | 0.80 | decoupled | 92.63% | 66%
| | 0.88 | joint | 92.28% | 54%
| mobilenetv1 | - | Baseline | 90.89% | -
| | 1.00 | decoupled | 91.06% | 78%
| | 1.00 | joint | 91.21% | 78%
imagenet | resnet34 | - | Baseline | 73.30% | -
| | 0.97 | decoupled | 73.25% | 25.86%
| | 0.95 | decoupled | 72.79% | 37.77%
| | 0.93 | decoupled | 72.17% | 47.42%
| | 0.92 | decoupled | 71.71% | 52.24%
| resnet18 | - | Baseline | 69.76% | -
| | 0.91 | decoupled | 67.49% | 51.56%
| mobilenetv1 | - | Baseline | 69.57% | -
| | 1.00 | decoupled | 69.66% | 41.07%
```bibtex
@InProceedings{elkerdawy2022fire,
  author    = {Elkerdawy, Sara and Elhoushi, Mostafa and Zhang, Hong and Ray, Nilanjan},
  title     = {Fire Together Wire Together: A Dynamic Pruning Approach with Self-Supervised Mask Prediction},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2022},
}
```