Diversity Based Edge Pruning of Neural Networks Using Determinantal Point Processes

Deep learning architectures with a huge number of parameters are often compressed using pruning techniques, which fall into two classes: node pruning and edge pruning. Fairly recent work established that Determinantal Point Process (DPP) based node pruning empirically outperforms competing node pruning methods. However, one prominent appeal of edge pruning over node pruning is the consistent finding in the literature that sparse (edge-pruned) neural networks generalize better than dense (node-pruned) ones. Building on this previous work and drawing motivation from synaptic diversity in the brain, we propose a novel diversity-based edge pruning technique for neural networks using DPPs. We then empirically show that DPP edge pruning outperforms competing methods (both edge and node pruning) on real data.
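To make the idea concrete, below is a minimal sketch of DPP-based edge pruning, assuming an L-ensemble kernel that combines edge "quality" (weight magnitude) with "diversity" (similarity of each edge's contributions over a batch of inputs), and greedy MAP inference to select the retained edges. The feature construction, the per-neuron budget `keep_per_neuron`, and the greedy selection are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dpp_greedy_map(L, k):
    """Greedily pick k items approximately maximizing det(L[S, S])."""
    n = L.shape[0]
    selected, remaining = [], list(range(n))
    for _ in range(k):
        best_gain, best_item = -np.inf, None
        for i in remaining:
            S = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(S, S)])
            if sign > 0 and logdet > best_gain:
                best_gain, best_item = logdet, i
        if best_item is None:
            break
        selected.append(best_item)
        remaining.remove(best_item)
    return selected

def dpp_edge_prune(W, X, keep_per_neuron):
    """Return a 0/1 mask over W keeping a diverse subset of incoming edges.

    Each incoming edge (i -> j) of output neuron j is represented by the
    signed contribution it makes across a batch of inputs X (shape
    [batch, in_dim]); the DPP then favors keeping edges whose
    contributions are dissimilar, i.e. a diverse set of "synapses".
    """
    mask = np.zeros_like(W)
    for j in range(W.shape[1]):                    # one DPP per output neuron
        phi = X * W[:, j]                          # [batch, in_dim] edge contributions
        phi = phi / (np.linalg.norm(phi, axis=0, keepdims=True) + 1e-12)
        quality = np.abs(W[:, j])                  # weight magnitude as edge quality
        L = np.outer(quality, quality) * (phi.T @ phi)  # L-ensemble kernel
        L += 1e-6 * np.eye(L.shape[0])             # keep L positive definite
        keep = dpp_greedy_map(L, keep_per_neuron)
        mask[keep, j] = 1.0
    return mask

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 4))   # weights of one dense layer (in_dim x out_dim)
X = rng.normal(size=(64, 16))  # a batch of inputs to that layer
mask = dpp_edge_prune(W, X, keep_per_neuron=4)
W_pruned = W * mask            # sparse layer: 4 diverse edges per neuron
print(f"kept {int(mask.sum())} of {mask.size} edges")
```

Exact DPP sampling is also possible in place of the greedy MAP step; the greedy variant is shown here because it is deterministic and easy to follow, at the cost of cubic-time determinant evaluations per candidate edge.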
