PyCIL: A Python Toolbox for Class-Incremental Learning


Introduction | Methods Reproduced | Results | How To Use | License | Acknowledgements | Citation



Introduction

Traditional machine learning systems are deployed under the closed-world setting, which requires all training data to be available before the offline training process. However, real-world applications often encounter new classes over time, and a model should incorporate them continually. This learning paradigm is called Class-Incremental Learning (CIL). We propose a Python toolbox that implements several key algorithms for class-incremental learning to ease the burden of researchers in the machine learning community. The toolbox not only contains implementations of founding works of CIL, such as EWC and iCaRL, but also provides current state-of-the-art algorithms that can be used for conducting novel fundamental research. This toolbox, named PyCIL for Python Class-Incremental Learning, is open source under the MIT license.

Methods Reproduced

  • FineTune: Baseline method that simply updates parameters on the new task and therefore suffers from catastrophic forgetting. By default, weights corresponding to the outputs of previous classes are not updated.
  • EWC: Overcoming Catastrophic Forgetting in Neural Networks. [paper]
  • LwF: Learning without Forgetting. [paper]
  • Replay: Baseline method with exemplars.
  • GEM: Gradient Episodic Memory for Continual Learning. [paper]
  • iCaRL: Incremental Classifier and Representation Learning. [paper]
  • BiC: Large Scale Incremental Learning. [paper]
  • WA: Maintaining Discrimination and Fairness in Class Incremental Learning. [paper]
  • PODNet: PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning. [paper]
  • DER: DER: Dynamically Expandable Representation for Class Incremental Learning. [paper]
  • Coil: Co-Transport for Class-Incremental Learning. [paper]

Reproduced Results

CIFAR100

Imagenet100

More experimental details and results are shown in our paper.

How To Use

Clone

Clone this GitHub repository:

git clone https://github.com/G-U-N/PyCIL.git
cd PyCIL

Dependencies

  1. torch 1.8.1
  2. torchvision 0.6.0
  3. tqdm
  4. numpy
  5. scipy
  6. quadprog
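
A minimal way to install these dependencies with pip (a sketch, assuming a fresh Python environment; if pip reports a conflict between the pinned torch and torchvision versions, relax the pins to mutually compatible releases):

pip install torch==1.8.1 torchvision==0.6.0 tqdm numpy scipy quadprog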

Run experiment

  1. Edit the [MODEL NAME].json file for global settings.
  2. Edit the hyperparameters in the corresponding [MODEL NAME].py file (e.g., models/icarl.py).
  3. Run:
python main.py --config=./exps/[MODEL NAME].json

where [MODEL NAME] should be chosen from: finetune, ewc, lwf, replay, gem, icarl, bic, wa, podnet, der.
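
For example, to run iCaRL with the configuration shipped under exps/:

python main.py --config=./exps/icarl.json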

  4. Hyper-parameters

When using PyCIL, you can edit the global parameters and algorithm-specific hyper-parameters in the corresponding JSON file.

These parameters include:

  • memory-size: The total number of exemplars kept during the incremental learning process. Assuming there are $K$ classes at the current stage, the model preserves $\left\lfloor \frac{\text{memory-size}}{K} \right\rfloor$ exemplars per class.
  • init-cls: The number of classes in the first incremental stage. Since CIL settings differ in the number of classes used in the first stage, our framework enables different choices for defining the initial stage.
  • increment: The number of classes in each incremental stage $i$, for $i > 1$. By default, every incremental stage after the first contains the same number of classes.
  • convnet-type: The backbone network for the incremental model. According to the benchmark setting, ResNet32 is utilized for CIFAR100, and ResNet18 is utilized for ImageNet.
  • seed: The random seed adopted for shuffling the class order. According to the benchmark setting, it is set to 1993 by default.

Other optimization-related parameters, e.g., batch size, number of epochs, learning rate, learning rate decay, weight decay, milestones, and temperature, can be modified in the corresponding Python file.
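
For illustration, a [MODEL NAME].json file might combine the global parameters described above roughly as follows. This is a hypothetical sketch: the key names (written here with underscores) and the sample values are assumptions made for illustration, so treat the JSON files shipped in exps/ as the authoritative reference.

{
    "model_name": "icarl",
    "dataset": "cifar100",
    "memory_size": 2000,
    "init_cls": 10,
    "increment": 10,
    "convnet_type": "resnet32",
    "seed": 1993
}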

Datasets

We have implemented the pre-processing of CIFAR100, imagenet100 and imagenet1000. When training on CIFAR100, this framework will automatically download it. When training on imagenet100/1000, you should specify the folder of your dataset in utils/data.py.

    def download_data(self):
        # Remove this assert and point train_dir/test_dir at your local
        # ImageNet train/val folders before running.
        assert 0, "You should specify the folder of your dataset"
        train_dir = '[DATA-PATH]/train/'
        test_dir = '[DATA-PATH]/val/'
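
After editing, the method might look like the following sketch, where /data/imagenet100 is a hypothetical path that should be replaced with the actual location of your ImageNet split:

    def download_data(self):
        # Hypothetical local paths: replace with your own dataset location.
        train_dir = '/data/imagenet100/train/'
        test_dir = '/data/imagenet100/val/'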

License

Please check the MIT license that is listed in this repository.

Acknowledgements

We thank the following repositories for providing helpful components/functions used in our work.

Citation

If you have any questions, please feel free to propose new features by opening an issue or to contact the authors: Da-Wei Zhou ([email protected]) and Fu-Yun Wang ([email protected]). Enjoy the code.
