
Data-Distortion Guided Self-Distillation for Deep Neural Networks

Unofficial PyTorch Implementation of Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)

Overview

*(Overview figure: the DDGSD training scheme.)*

DDGSD trains a single network on two differently distorted versions of each input and penalizes disagreement between the two outputs, so the network acts as its own teacher during training.
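Concretely, one training step can be sketched as below, following this implementation's substitution of MSE for the paper's MMD loss. This is a minimal sketch, not the repo's `trainer.py`; the function name and the consistency weight are illustrative.

```python
import torch
import torch.nn.functional as F

def ddgsd_step(model, view1, view2, labels, consistency_weight=1.0):
    """One DDGSD-style training step (sketch, not the repo's code):
    cross-entropy on both distorted views of the same batch, plus a
    consistency loss between the two softmax outputs."""
    logits1 = model(view1)  # forward pass on the first distorted view
    logits2 = model(view2)  # same network, second distorted view
    ce = F.cross_entropy(logits1, labels) + F.cross_entropy(logits2, labels)
    # This implementation replaces the paper's MMD loss with MSE.
    consistency = F.mse_loss(logits1.softmax(dim=1), logits2.softmax(dim=1))
    return ce + consistency_weight * consistency
```

In practice `view1` and `view2` are two independently augmented copies of the same images, and the returned loss is backpropagated through the single shared network.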

Code Structure

├─ src/
│   ├─ model/
│   │   └─ net.py
│   ├─ config.py
│   ├─ dataset.py
│   ├─ main.py
│   ├─ trainer.py
│   └─ utils.py
├─ scripts/
│   ├─ run_baseline.sh
│   └─ run_ddgsd.sh
├─ .gitignore
├─ Dockerfile
├─ LICENSE
├─ README.md
└─ requirements.txt

Dependencies

* torch==1.6.0
* torchvision==0.7.0

All dependencies are listed in requirements.txt, and you can also use the provided Dockerfile.

How to Run

Baseline

$ sh scripts/run_baseline.sh

DDGSD

$ sh scripts/run_ddgsd.sh

Results

| Dataset   | Model    | Top-1 Error | Top-5 Error | Method   |
|-----------|----------|-------------|-------------|----------|
| CIFAR-100 | ResNet18 | 30.15%      | 9.58%       | Baseline |
| CIFAR-100 | ResNet18 | 26.60%      | 8.36%       | DDGSD    |

* Hyperparameters of this implementation follow the paper's settings.
* MMD loss is replaced with MSE loss in this implementation.
* For other differences, check this issue.
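For illustration, the dual-distortion input pipeline that DDGSD requires can be sketched as a dataset wrapper that applies one stochastic augmentation pipeline twice per image. The class name and structure below are illustrative, not taken from the repo's `src/dataset.py`.

```python
import torch
from torch.utils.data import Dataset

class TwoViewDataset(Dataset):
    """Wraps a base dataset so each item yields two independently
    distorted views of the same image (DDGSD-style; sketch only)."""

    def __init__(self, base, transform):
        self.base = base            # yields (image, label) pairs
        self.transform = transform  # stochastic augmentation pipeline

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        img, label = self.base[idx]
        # Applying the same random transform twice produces two
        # different distortions of the identical source image.
        return self.transform(img), self.transform(img), label
```

With CIFAR-100, `transform` would typically combine random crops and horizontal flips; the two returned views then feed the two forward passes of the self-distillation step.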
