no code implementations • 14 Sep 2021 • Yinghan Long, Indranil Chakraborty, Gopalakrishnan Srinivasan, Kaushik Roy
Only inputs with a high probability of belonging to hard classes are sent to the extension block for prediction.
1 code implementation • ICLR 2020 • Nitin Rathi, Gopalakrishnan Srinivasan, Priyadarshini Panda, Kaushik Roy
We propose a hybrid training methodology: 1) take a converted SNN and use its weights and thresholds as an initialization step for spike-based backpropagation, and 2) perform incremental spike-timing dependent backpropagation (STDB) on this carefully initialized network to obtain an SNN that converges within a few epochs and requires fewer time steps for input processing.
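The two phases above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the helper names (`calibrate_threshold`, `surrogate_grad`), the max-activation threshold heuristic, and the particular surrogate shape are assumptions standing in for the conversion and STDB machinery.

```python
import numpy as np

def calibrate_threshold(weights, activations):
    """Phase 1 (conversion-based init): set the layer threshold from the
    maximum pre-activation observed on calibration data, a common
    ANN-to-SNN conversion heuristic (illustrative, not the paper's exact rule)."""
    return float(np.max(activations @ weights.T))

def surrogate_grad(v_mem, threshold, alpha=0.3):
    """Phase 2 (spike-based backprop): replace the non-differentiable
    spike function with a smooth surrogate gradient so the carefully
    initialized network can be fine-tuned for a few epochs."""
    return alpha * np.maximum(0.0, 1.0 - np.abs(v_mem - threshold))

rng = np.random.default_rng(0)
ann_weights = rng.normal(size=(4, 8))          # weights taken from the trained ANN
calib = rng.uniform(0, 1, size=(32, 8))        # calibration inputs
thr = calibrate_threshold(ann_weights, calib)  # phase-1 initialization
grads = surrogate_grad(rng.normal(size=4), thr)  # phase-2 gradient signal
```

The key point is only the division of labor: conversion supplies a good starting point, and surrogate-gradient backpropagation then needs far fewer epochs and time steps than training from scratch.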
no code implementations • 5 Mar 2020 • Sourjya Roy, Priyadarshini Panda, Gopalakrishnan Srinivasan, Anand Raghunathan
Our results for VGG-16 trained on CIFAR10 show that L1 normalization provides the best performance among all the techniques explored in this work, with less than a 1% drop in accuracy after pruning 80% of the filters compared to the original network.
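L1-norm filter pruning of the kind referenced above can be sketched in a few lines. This is a generic illustration of the technique, assuming a standard `(num_filters, in_channels, kH, kW)` weight layout; it is not the paper's code.

```python
import numpy as np

def prune_filters_l1(conv_weights, keep_ratio=0.2):
    """Rank convolutional filters by the L1 norm of their weights and
    keep only the top `keep_ratio` fraction, pruning the rest.
    conv_weights: array of shape (num_filters, in_ch, kH, kW)."""
    scores = np.abs(conv_weights).reshape(conv_weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])  # indices of surviving filters
    return conv_weights[keep], keep

rng = np.random.default_rng(1)
w = rng.normal(size=(10, 3, 3, 3))
pruned, kept = prune_filters_l1(w, keep_ratio=0.2)  # prune 80% of the filters
```

Pruning whole filters (rather than individual weights) shrinks the actual layer dimensions, which is what makes the 80% reduction translate into real compute savings.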
no code implementations • 2 Mar 2020 • Jason M. Allred, Steven J. Spencer, Gopalakrishnan Srinivasan, Kaushik Roy
Spiking Neural Networks (SNNs) are being explored for their potential energy efficiency resulting from sparse, event-driven computations.
1 code implementation • CVPR 2020 • Bing Han, Gopalakrishnan Srinivasan, Kaushik Roy
We find that performance degradation in the converted SNN stems from the use of a "hard reset" spiking neuron, which is driven to a fixed reset potential once its membrane potential exceeds the firing threshold, leading to information loss during SNN inference.
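The hard-reset information loss described above is easy to see in a one-step integrate-and-fire update. A minimal sketch (the function name and simplified dynamics are illustrative):

```python
import numpy as np

def lif_step(v, inp, thr=1.0, reset="soft"):
    """One integrate-and-fire update. A 'hard' reset snaps the membrane
    potential to a fixed value (0 here) after a spike, discarding any
    charge above threshold; a 'soft' reset subtracts the threshold,
    carrying the residual forward to later time steps."""
    v = v + inp
    spike = v >= thr
    if reset == "hard":
        v = np.where(spike, 0.0, v)
    else:  # soft reset, i.e. "reset by subtraction"
        v = np.where(spike, v - thr, v)
    return v, spike

v_hard, _ = lif_step(np.array([0.4]), np.array([1.0]), reset="hard")
v_soft, _ = lif_step(np.array([0.4]), np.array([1.0]), reset="soft")
# the soft reset retains the residual 0.4 above threshold; the hard reset discards it
```

Over many time steps those discarded residuals accumulate into the rate-coding error that degrades converted-SNN accuracy.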
1 code implementation • 4 Jun 2019 • Wachirawit Ponghiran, Gopalakrishnan Srinivasan, Kaushik Roy
We propose reinforcement learning on simple networks consisting of random connections of spiking neurons (both recurrent and feed-forward) that can learn complex tasks with very few trainable parameters.
no code implementations • 15 Mar 2019 • Chankyu Lee, Syed Shakib Sarwar, Priyadarshini Panda, Gopalakrishnan Srinivasan, Kaushik Roy
Spiking Neural Networks (SNNs) have recently emerged as a prominent neural computing paradigm.
no code implementations • 11 Feb 2019 • Gopalakrishnan Srinivasan, Kaushik Roy
In addition, we introduce residual connections between the stacked convolutional layers to improve the hierarchical feature learning capability of deep SNNs.
no code implementations • 1 Jul 2018 • Amogh Agrawal, Akhilesh Jaiswal, Deboleena Roy, Bing Han, Gopalakrishnan Srinivasan, Aayush Ankit, Kaushik Roy
In this paper, we demonstrate how deep binary networks can be accelerated in modified von-Neumann machines by enabling binary convolutions within the SRAM array.
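The binary convolutions mentioned above reduce, per dot product, to an XNOR followed by a popcount, which is the arithmetic that in-SRAM accelerators map onto the bitlines. A minimal sketch of that equivalence, assuming the usual {0,1} encoding of {-1,+1} values (not the paper's circuit):

```python
import numpy as np

def binary_dot_xnor(a_bits, w_bits):
    """Binary dot product via XNOR + popcount.
    a_bits, w_bits: arrays of {0, 1} encoding {-1, +1}.
    Counting bit positions where the operands match (XNOR) and
    rescaling recovers the +/-1 dot product exactly."""
    n = len(a_bits)
    matches = int(np.sum(a_bits == w_bits))  # popcount of the XNOR
    return 2 * matches - n

a = np.array([1, 0, 1, 1])
w = np.array([1, 1, 0, 1])
# sanity check against the explicit +/-1 arithmetic
assert binary_dot_xnor(a, w) == int(np.dot(2 * a - 1, 2 * w - 1))
```

Because XNOR and popcount need no multipliers, evaluating them directly on SRAM bitlines avoids moving the weights out of the array at all.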
Emerging Technologies
no code implementations • 10 Mar 2017 • Priyadarshini Panda, Gopalakrishnan Srinivasan, Kaushik Roy
Brain-inspired learning models attempt to mimic the cortical architecture and computations performed in the neurons and synapses constituting the human brain to achieve its efficiency in cognitive tasks.
1 code implementation • 29 Sep 2016 • Akhilesh Jaiswal, Sourjya Roy, Gopalakrishnan Srinivasan, Kaushik Roy
The efficiency of the human brain in performing classification tasks has attracted considerable research interest in brain-inspired neuromorphic computing.
no code implementations • 27 Feb 2016 • Gopalakrishnan Srinivasan, Parami Wijesinghe, Syed Shakib Sarwar, Akhilesh Jaiswal, Kaushik Roy
Our analysis on a widely used digit recognition dataset indicates that the voltage can be scaled down by 200mV from the nominal operating voltage (950mV) with practically no loss (less than 0.5%) in accuracy (22nm predictive technology).