no code implementations • 27 Mar 2024 • Qingyu Wang, Duzhen Zhang, Tielin Zhang, Bo Xu
The energy-efficient Spikformer has been proposed by integrating the biologically plausible spiking neural network (SNN) with the artificial Transformer, where Spiking Self-Attention (SSA) is used to achieve both high accuracy and low computational cost.
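To make the SSA idea concrete, here is a minimal NumPy sketch of softmax-free spiking self-attention: the query, key, and value maps are binary spike tensors, so their products are non-negative and no softmax normalization is needed; the result is scaled and re-thresholded back into spikes. The scale factor, threshold, and tensor shapes below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def heaviside(x, thresh=1.0):
    # Emit a spike (1.0) wherever the input reaches the threshold, else 0.0.
    # (Training would use a surrogate gradient; this is forward-pass only.)
    return (x >= thresh).astype(np.float32)

def spiking_self_attention(Q, K, V, scale=0.125, thresh=1.0):
    # Q, K, V are binary spike maps of shape (N, d). Because spikes are
    # non-negative, Q @ K.T @ V stays well-behaved without a softmax.
    attn = Q @ K.T           # (N, N) integer-valued spike correlations
    out = attn @ V * scale   # (N, d) scaled aggregation of value spikes
    return heaviside(out, thresh)  # convert the result back into spikes

# Toy usage with N=4 tokens of dimension d=8 (illustrative sizes).
N, d = 4, 8
Q = heaviside(rng.random((N, d)), 0.5)
K = heaviside(rng.random((N, d)), 0.5)
V = heaviside(rng.random((N, d)), 0.5)
out = spiking_self_attention(Q, K, V)
print(out.shape)  # (4, 8), binary output spikes
```

Dropping the softmax is the key efficiency lever: the attention map is built entirely from sparse binary matrix products, which map to additions rather than floating-point multiplications on neuromorphic hardware.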