Code release for the CVPR'23 paper "PartDistillation: Learning Parts from Instance Segmentation"

PartDistillation: Learning Parts from Instance Segmentation

PartDistillation learns to segment parts over 10k object categories without labels.

PartDistillation: Learning Parts from Instance Segmentation,
Jang Hyun Cho, Philipp Krähenbühl, Vignesh Ramanathan,
CVPR 2023 [paper, project page]

Contact: [email protected]

🔥 News 🔥

  • PartDistillation demo is out!
  • ImageNet-1K training commands.
  • Initial commit.

Features

  • Unsupervised part segmentation using emergent part signals from a strong instance segmentation model.
  • Open-vocabulary object-part segmentation (try it out here).
  • Self-training to discover novel parts across 10K object classes (no part segmentation labels used!).
  • Strong zero-shot and few-shot performance.

Installation

Please see installation instructions.

DEMO

A short demo for PartDistillation with an image of a person and a bicycle:

Use the following command to segment each class:

python part_distillation_demo.py --input figs/input/bicycle_person.jpg --output figs/output/part_proposal/bicycle.jpg --vocabulary custom --confidence-threshold 0.1 --part-score-threshold 0.3 --custom_vocabulary bicycle --min-image-size 640 --non-overlapping
python part_distillation_demo.py --input figs/input/bicycle_person.jpg --output figs/output/part_proposal/person.jpg --vocabulary custom --confidence-threshold 0.1 --part-score-threshold 0.3 --custom_vocabulary person --min-image-size 640 --non-overlapping
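The `--part-score-threshold` and `--non-overlapping` flags control how candidate part masks are post-processed. As a rough illustration of what such post-processing can look like (this is a hypothetical sketch, not the repo's actual implementation; `postprocess_parts` and its arguments are made-up names):

```python
import numpy as np

def postprocess_parts(masks, scores, score_threshold=0.3, non_overlapping=True):
    """Filter part masks by score and optionally resolve overlaps.

    masks:  (N, H, W) boolean array of candidate part masks.
    scores: (N,) part scores in [0, 1].
    Illustrative sketch only -- not PartDistillation's actual code.
    """
    # Drop low-scoring part proposals (mirrors --part-score-threshold).
    keep = scores >= score_threshold
    masks, scores = masks[keep], scores[keep]

    if non_overlapping and len(masks) > 0:
        # Assign each pixel to the highest-scoring mask covering it
        # (one way to realize a --non-overlapping constraint).
        weighted = masks * scores[:, None, None]   # weight each mask by score
        winner = np.argmax(weighted, axis=0)       # best part index per pixel
        covered = masks.any(axis=0)                # pixels covered by any part
        masks = np.stack([(winner == i) & covered for i in range(len(masks))])
    return masks, scores
```

After this step, every pixel belongs to at most one part mask, which is why the visualized part proposals do not overlap.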

If set up correctly, the output should look like this:

Getting Started

See instructions for preparing datasets and preparing models to train PartDistillation.

Using PartDistillation

Please refer to our demo to explore the model. Also see checkpoints and inference to learn how to use PartDistillation.

Training PartDistillation

For now, we provide compute-friendly training commands for the ImageNet-1K dataset. This setting requires only a single 8-GPU node and matches the reported results on the zero-shot and few-shot benchmarks.

The original ImageNet-21K training commands are available here.

Benchmark Training and Evaluation

We provide zero-shot and few-shot benchmarks on various datasets. Please see benchmark training and evaluation for details.

License

Copyright (c) Meta Platforms, Inc. and affiliates.

This source code is licensed under the license found in the LICENSE file in the root directory of this source tree.

Citation

If you find this project useful for your research, please cite our paper with the following BibTeX entry.

@InProceedings{Cho_2023_CVPR,
    author    = {Cho, Jang Hyun and Kr\"ahenb\"uhl, Philipp and Ramanathan, Vignesh},
    title     = {PartDistillation: Learning Parts From Instance Segmentation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {7152-7161}
}
