no code implementations • 18 Aug 2021 • Zhu Baozhou, Peter Hofstee, Jinho Lee, Zaid Al-Ars
To address both problems jointly, we first propose an attention module for convolutional neural networks built on an AW-convolution, in which the shape of the attention maps matches that of the weights rather than that of the activations.
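A minimal sketch of the idea above, assuming a plain stride-1 "valid" convolution: the attention map `A` has the same shape as the weight tensor `W` (not the activations), so attention is applied by element-wise rescaling of the weights before convolving. The function name `aw_conv2d` and all shapes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def aw_conv2d(x, W, A):
    """Hypothetical AW-convolution sketch.
    x: activations, shape (in_ch, H, W_img)
    W: conv weights, shape (out_ch, in_ch, k, k)
    A: attention map with the SAME shape as W (the key idea),
       rather than the shape of the activations.
    Stride-1 'valid' convolution, no padding."""
    Wa = W * A  # attention-refined weights
    out_ch, in_ch, k, _ = Wa.shape
    H, Wd = x.shape[1], x.shape[2]
    out = np.zeros((out_ch, H - k + 1, Wd - k + 1))
    for o in range(out_ch):
        for i in range(H - k + 1):
            for j in range(Wd - k + 1):
                out[o, i, j] = np.sum(Wa[o] * x[:, i:i + k, j:j + k])
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8, 8))
W = rng.standard_normal((4, 3, 3, 3))
A = np.ones_like(W)        # identity attention recovers a standard convolution
y = aw_conv2d(x, W, A)
print(y.shape)  # (4, 6, 6)
```

With `A` set to all ones the operation reduces to an ordinary convolution, which makes the module a drop-in refinement of the existing weights.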
no code implementations • 11 Sep 2020 • Zhu Baozhou, Peter Hofstee, Jinho Lee, Zaid Al-Ars
Inspired by shortcut connections and fractal architectures, we propose two Shortcut-based Fractal Architectures (SoFAr) specifically designed for BCNNs: (1) residual-connection-based fractal architectures for binary ResNet, and (2) dense-connection-based fractal architectures for binary DenseNet.