Recently, researchers proposed pruning deep neural network (DNN) weights using an N:M fine-grained sparsity mask, in which each block of M consecutive weights contains at most N non-zero elements.
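As a concrete illustration of the N:M pattern, here is a minimal sketch in PyTorch (the helper `nm_prune` and the 2:4 setting are illustrative assumptions, not this repository's API): within every group of M consecutive weights, only the N largest-magnitude entries survive.

```python
import torch

def nm_prune(weight: torch.Tensor, n: int = 2, m: int = 4) -> torch.Tensor:
    """Keep the n largest-magnitude entries in every group of m consecutive weights."""
    groups = weight.reshape(-1, m)                       # blocks of m consecutive weights
    idx = groups.abs().topk(n, dim=1).indices            # the n largest magnitudes per block
    mask = torch.zeros_like(groups).scatter_(1, idx, 1.0)
    return (groups * mask).reshape(weight.shape)

w = torch.randn(8, 8)
w_sparse = nm_prune(w)   # each block of 4 keeps only its 2 largest-magnitude weights
```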
This repository is partially based on the convNet.pytorch repo. Please ensure that you are using PyTorch 1.7+.

Reproducing AdaPrune results:
cd AdaPrune
sh scripts/adaprune_dense_bnt.sh
sh scripts/adaprune_sparse.sh
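For intuition about what these scripts optimize, here is a hedged sketch of the general AdaPrune idea (the function `adaprune_layer`, its loop, and its hyperparameters are assumptions for illustration, not the scripts' actual code): after the sparsity mask is applied, the surviving weights of each layer are re-optimized so that the sparse layer reconstructs the dense layer's output on a small calibration batch.

```python
import torch

def adaprune_layer(w_dense, mask, x_calib, steps=200, lr=1e-3):
    """Re-optimize masked weights so the sparse layer mimics the dense one."""
    target = x_calib @ w_dense.t()                 # dense layer output to reconstruct
    w = (w_dense * mask).detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([w], lr=lr)
    for _ in range(steps):
        loss = ((x_calib @ (w * mask).t()) - target).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (w * mask).detach()

mask = (torch.rand(16, 32) > 0.5).float()          # stand-in mask for the demo
x_calib = torch.randn(64, 32)                      # small calibration batch
w_sparse = adaprune_layer(torch.randn(16, 32), mask, x_calib)
```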
Reproducing static NM-transposable results, starting from a dense pre-trained model:
cd static_TNM
sh scripts/prune_pretrained_R50.sh
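A transposable N:M mask satisfies the N:M constraint along both the rows and the columns of the weight matrix, so the same sparse pattern can serve both the forward pass (W) and the backward pass (W^T). Below is a minimal check of that property, assuming 2:4 blocks (`is_transposable_nm` is a hypothetical helper, not part of static_TNM):

```python
import torch

def is_nm(mask: torch.Tensor, n: int = 2, m: int = 4) -> bool:
    """True if every group of m consecutive entries has at most n non-zeros."""
    groups = mask.reshape(-1, m)
    return bool((groups.sum(dim=1) <= n).all())

def is_transposable_nm(mask: torch.Tensor, n: int = 2, m: int = 4) -> bool:
    """True if the N:M constraint holds for both the mask and its transpose."""
    return is_nm(mask, n, m) and is_nm(mask.t().contiguous(), n, m)

mask = torch.tensor([[1., 1., 0., 0.],
                     [0., 0., 1., 1.],
                     [1., 1., 0., 0.],
                     [0., 0., 1., 1.]])
print(is_transposable_nm(mask))   # True: every row and column block is 2:4
```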
Reproducing dynamic NM-transposable training from scratch:
cd dynamic_TNM
sh scripts/clone_and_copy.sh
sh scripts/run_R18.sh
sh scripts/run_R50.sh
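In the dynamic setting the mask is not fixed before training: it is recomputed from the current weights as training progresses. The sketch below only illustrates that idea (the greedy row/column mask intersection, the toy loss, and the refresh schedule are assumptions; the actual mask search used in dynamic_TNM is not shown here):

```python
import torch

def greedy_transposable_mask(w, n=2, m=4):
    """Intersect row-wise and column-wise N:M masks (illustration only)."""
    def nm_mask(x):
        g = x.abs().reshape(-1, m)
        idx = g.topk(n, dim=1).indices
        return torch.zeros_like(g).scatter_(1, idx, 1.0).reshape(x.shape)
    return nm_mask(w) * nm_mask(w.t()).t()

w = torch.nn.Parameter(torch.randn(64, 64))
opt = torch.optim.SGD([w], lr=0.1)
for step in range(200):
    if step % 40 == 0:                        # refresh the mask periodically
        mask = greedy_transposable_mask(w.detach())
    loss = ((w * mask).sum() - 1.0) ** 2      # stand-in for a real training loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Intersecting the row-wise and column-wise masks guarantees both constraints hold, at the cost of possibly keeping fewer than N entries per block; it is only a stand-in for the repository's mask search.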