A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras.
- Pixel Attention : Efficient Image Super-Resolution Using Pixel Attention (Hengyuan Zhao et al.) — see the sketch after this list
- Channel Attention : CBAM: Convolutional Block Attention Module (Sanghyun Woo et al.)
- Efficient Channel Attention : ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks (Qilong Wang et al.)
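For intuition, the pixel attention mechanism from the PAN paper boils down to a 1x1 convolution with a sigmoid activation whose output re-weights every pixel and channel of the feature map. The sketch below only illustrates that idea; `PixelAttentionSketch` is a hypothetical name and is not necessarily how `PixelAttention2D` is implemented in this package.

from tensorflow.keras import layers

class PixelAttentionSketch(layers.Layer):
    # Illustrative only: a 1x1 conv + sigmoid yields an H x W x C attention
    # map with values in (0, 1) that re-weights the input element-wise.
    def __init__(self, nf, **kwargs):
        super().__init__(**kwargs)
        self.attn = layers.Conv2D(nf, kernel_size=1, activation='sigmoid')

    def call(self, x):
        return x * self.attn(x)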
You can see the project's official PyPI page: https://pypi.org/project/visual-attention-tf/
pip install visual-attention-tf
Use `--no-dependencies` if you already have tensorflow-gpu installed.
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D, EfficientChannelAttention2D
inp = Input(shape=(1920, 1080, 3))
cnn_layer = Conv2D(32, 3, activation='relu', padding='same')(inp)
# Using cnn_layer.shape[-1] keeps the channel count in sync if the network changes; you can also pass the number of channels directly
Pixel_attention_cnn = PixelAttention2D(cnn_layer.shape[-1])(cnn_layer)
Channel_attention_cnn = ChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
EfficientChannelAttention_cnn = EfficientChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
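The attention outputs above can be wired into a standard Keras Model like any other layer output. A minimal sketch follows; the choice of output, optimizer, and loss here is illustrative, not prescribed by the package.

model = Model(inputs=inp, outputs=Pixel_attention_cnn)
model.compile(optimizer='adam', loss='mse')
model.summary()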