
AB-BNN

This is the official PyTorch implementation of the paper A&B BNN: Add&Bit-Operation-Only Hardware-Friendly Binary Neural Network, published at CVPR 2024.

Poster

Paper Links:   arXiv | CVPR | Google Scholar | IEEE
Video Links:   Bilibili | YouTube

Abstract

Binary neural networks use 1-bit quantized weights and activations to reduce both a model's storage footprint and its computational cost. However, advanced binary architectures still contain millions of inefficient, non-hardware-friendly full-precision multiplication operations. A&B BNN directly removes part of the multiplications in a traditional BNN and replaces the rest with an equal number of bit operations, introducing a mask layer and a quantized RPReLU structure built on the normalizer-free network architecture. The mask layer can be removed at inference time through straightforward mathematical transformations that exploit the intrinsic characteristics of BNNs, avoiding the associated multiplications. The quantized RPReLU structure enables more efficient bit operations by constraining its slope to integer powers of 2. Experiments achieve 92.30%, 69.35%, and 66.89% top-1 accuracy on CIFAR-10, CIFAR-100, and ImageNet, respectively, which is competitive with the state of the art. Ablation studies verify the efficacy of the quantized RPReLU structure, which yields a 1.14% improvement on ImageNet over a fixed-slope RLeakyReLU. The proposed add&bit-operation-only BNN offers an innovative approach to hardware-friendly network architecture.
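To make the quantized RPReLU idea concrete, below is a minimal PyTorch sketch, not the repository's exact module: parameter names, shapes, and the straight-through rounding are illustrative assumptions. The point is that forcing the negative-branch slope to an integer power of 2 turns the multiplication into an arithmetic bit shift on hardware.

import torch
import torch.nn as nn

class QuantizedRPReLU(nn.Module):
    # Illustrative sketch of an RPReLU whose negative-branch slope is
    # constrained to 2^k (k integer), so the multiply can become a shift.
    def __init__(self, channels):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))  # input shift
        self.zeta = nn.Parameter(torch.zeros(1, channels, 1, 1))   # output shift
        self.k = nn.Parameter(torch.zeros(1, channels, 1, 1))      # slope exponent

    def forward(self, x):
        # Round the exponent so the slope is exactly 2^k; the straight-through
        # estimator keeps the exponent trainable despite the rounding.
        k = self.k + (torch.round(self.k) - self.k).detach()
        slope = torch.pow(2.0, k)  # multiplying by 2^k == arithmetic shift in hardware
        x = x - self.gamma
        return torch.where(x > 0, x, slope * x) + self.zeta

# Usage example
y = QuantizedRPReLU(16)(torch.randn(2, 16, 8, 8))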

Citation

If you find our code useful for your research, please consider citing:

@InProceedings{Ma_2024_CVPR,
    author    = {Ma, Ruichen and Qiao, Guanchao and Liu, Yian and Meng, Liwei and Ning, Ning and Liu, Yang and Hu, Shaogang},
    title     = {A\&B BNN: Add\&Bit-Operation-Only Hardware-Friendly Binary Neural Network},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {5704-5713}
}

Requirements

  • python==3.8
  • pytorch==2.0.1
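One possible way to set up a matching environment (these conda/pip commands are a suggestion, not taken from the repository):

conda create -n abbnn python=3.8
conda activate abbnn
pip install torch==2.0.1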

Pre-trained Models

The pre-trained models can be downloaded from Baidu Netdisk (pwd=abnn), and the inference code can be found in ./Inference. The following command checks the MD5 checksums of all files:

md5sum -c md5.txt
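Once the checksums pass, a downloaded checkpoint can be inspected like any PyTorch file. The snippet below is a hedged sketch that assumes the checkpoints are ordinary torch.save outputs (an assumption; the actual loading logic lives in ./Inference, and "path/to/checkpoint.pth" is a placeholder):

import torch

# Assumption: the checkpoint is a standard torch.save file containing a
# state dict or a wrapper dict; adjust to the scripts in ./Inference.
state = torch.load("path/to/checkpoint.pth", map_location="cpu")
print(list(state.keys())[:5])  # inspect the stored keys before loading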

Main Results

Dataset     Structure     # Params   Top-1 Acc
CIFAR10     ReActNet-18   11.18 M    91.94%
            ReActNet-A    28.32 M    89.44%
CIFAR100    ReActNet-18   11.23 M    69.35%
            ReActNet-A    28.41 M    63.23%
ImageNet    ReActNet-18   11.70 M    61.39%
            ReActNet-34   21.82 M    65.19%
            ReActNet-A    29.33 M    66.89%

All of the models above are downloadable via the Baidu Netdisk link in the Pre-trained Models section.
