no code implementations • 24 Apr 2024 • Lang Qin, ZiMing Wang, Runhao Jiang, Rui Yan, Huajin Tang
Spiking neural networks (SNNs) are widely applied in various fields due to their energy-efficient and fast-inference capabilities.
no code implementations • 19 Mar 2024 • ZiMing Wang, Ziling Wang, Huaning Li, Lang Qin, Runhao Jiang, De Ma, Huajin Tang
Event cameras, with their high dynamic range and temporal resolution, are ideally suited for object detection, especially under scenarios with motion blur and challenging lighting conditions.
no code implementations • NeurIPS 2023 • Qi Xu, Yuyuan Gao, Jiangrong Shen, Yaxin Li, Xuming Ran, Huajin Tang, Gang Pan
Spiking neural networks (SNNs) are efficient models for processing spatio-temporal patterns in time series, such as Address-Event Representation data collected from a Dynamic Vision Sensor (DVS).
no code implementations • 29 Dec 2023 • De Ma, Xiaofei Jin, Shichun Sun, Yitao Li, Xundong Wu, Youneng Hu, Fangchao Yang, Huajin Tang, Xiaolei Zhu, Peng Lin, Gang Pan
The Darwin3 chip supports up to 2.35 million neurons, making it the largest of its kind in neuron scale.
no code implementations • 24 Dec 2023 • Zexiang Yi, Jing Lian, Yunliang Qi, Zhaofei Yu, Huajin Tang, Yide Ma, Jizhao Liu
In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., a pulse-coupled neural network (PCNN), to improve the expressiveness and recognition performance of SNNs for vision tasks.
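The richer dynamics that distinguish a PCNN from simpler spiking neurons can be illustrated with the standard discrete-time pulse-coupled neuron update below. This is a generic sketch of the classic PCNN formulation; the decay constants, amplitudes, and linking strength shown are illustrative values, and the paper's exact model may differ.

```python
import math

def pcnn_step(S, F, L, theta, Y_nbr,
              alpha_F=0.1, alpha_L=0.3, alpha_T=0.2,
              V_F=0.5, V_L=0.2, V_T=20.0, beta=0.1):
    """One discrete-time update of a standard pulse-coupled neuron.

    S     : external stimulus (e.g. a pixel intensity)
    F, L  : feeding and linking compartments from the previous step
    theta : dynamic firing threshold from the previous step
    Y_nbr : weighted sum of neighbouring spikes at the previous step
    """
    F = math.exp(-alpha_F) * F + V_F * Y_nbr + S   # feeding input decays, then integrates
    L = math.exp(-alpha_L) * L + V_L * Y_nbr       # linking input from neighbours
    U = F * (1.0 + beta * L)                       # multiplicative (modulatory) coupling
    Y = 1 if U > theta else 0                      # spike when activity exceeds threshold
    theta = math.exp(-alpha_T) * theta + V_T * Y   # threshold decays, jumps after a spike
    return F, L, U, theta, Y
```

Iterating this step over time yields the synchronized pulse bursts PCNNs are known for; the multiplicative `F * (1 + beta * L)` coupling and the dynamic threshold are what give the model more complex dynamics than an additive leaky integrate-and-fire neuron.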
no code implementations • 10 Nov 2023 • Shi Gu, Marcelo G Mattar, Huajin Tang, Gang Pan
While advances in artificial intelligence and neuroscience have enabled the emergence of neural networks capable of learning a wide variety of tasks, our understanding of the temporal dynamics of these networks remains limited.
no code implementations • 11 Sep 2023 • Huajin Tang, Pengjie Gu, Jayawan Wijekoon, MHD Anas Alsakkal, ZiMing Wang, Jiangrong Shen, Rui Yan
Neuromorphic computing holds the promise to achieve the energy efficiency and robust learning performance of biological neural systems.
no code implementations • 29 Aug 2023 • Xinyi Chen, Jibin Wu, Huajin Tang, Qinyuan Ren, Kay Chen Tan
The human brain exhibits remarkable abilities in integrating temporally distant sensory inputs for decision-making.
no code implementations • 21 Jun 2023 • Jinye Qu, Zeyu Gao, Tielin Zhang, YanFeng Lu, Huajin Tang, Hong Qiao
We also present an SNN-based ultra-low-latency, high-accuracy object detection model (SUHD) that achieves state-of-the-art performance on nontrivial datasets such as PASCAL VOC and MS COCO, with a remarkable ~750x fewer timesteps and a 30% mean average precision (mAP) improvement over Spiking-YOLO on MS COCO.
no code implementations • NeurIPS 2023 • Gehua Ma, Runhao Jiang, Rui Yan, Huajin Tang
This work presents the temporal conditioning spiking latent variable models (TeCoS-LVM) to simulate the neural response to natural visual stimuli.
no code implementations • 21 Jun 2023 • Xundong Wu, Pengfei Zhao, Zilin Yu, Lei Ma, Ka-Wa Yip, Huajin Tang, Gang Pan, Tiejun Huang
Our comprehension of biological neuronal networks has profoundly influenced the evolution of artificial neural networks (ANNs).
no code implementations • 6 Jun 2023 • Jiangrong Shen, Qi Xu, Jian K. Liu, Yueming Wang, Gang Pan, Huajin Tang
To take full advantage of low power consumption and further improve the efficiency of these models, pruning methods have been explored to find sparse SNNs without redundant connections after training.
1 code implementation • 25 May 2023 • Gehua Ma, Rui Yan, Huajin Tang
Despite extensive research on spiking neural networks (SNNs), most studies are established on deterministic models, overlooking the inherent non-deterministic, noisy nature of neural computations.
no code implementations • 19 Apr 2023 • Qi Xu, Yaxin Li, Xuanye Fang, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan
The proposed method explores a novel dynamic way of learning structure from scratch in SNNs, building a bridge that closes the gap between deep learning and bio-inspired neural dynamics.
no code implementations • CVPR 2023 • Qi Xu, Yaxin Li, Jiangrong Shen, Jian K Liu, Huajin Tang, Gang Pan
Spiking neural networks (SNNs) are well known as brain-inspired models with high computing efficiency, owing to a key feature: they use spikes as information units, much like biological neural systems.
1 code implementation • 21 Nov 2022 • Lang Qin, Rui Yan, Huajin Tang
In recent years, spiking neural networks (SNNs) have been used in reinforcement learning (RL) due to their low power consumption and event-driven features.
1 code implementation • 12 Oct 2022 • Lang Feng, Qianhui Liu, Huajin Tang, De Ma, Gang Pan
Spiking neural networks (SNNs) are bio-inspired neural networks with asynchronous, discrete, and sparse characteristics, which have increasingly demonstrated their superiority in low energy consumption.
1 code implementation • 26 Jul 2022 • Chaofei Hong, Mengwen Yuan, Mengxiao Zhang, Xiao Wang, Chengjun Zhang, Jiaxin Wang, Gang Pan, Zhaohui Wu, Huajin Tang
In this work, we present SPAIC, a Python-based spiking neural network (SNN) simulation and training framework that aims to support research on brain-inspired models and algorithms, integrating features from both deep learning and neuroscience.
1 code implementation • 16 May 2022 • ZiMing Wang, Shuang Lian, Yuhao Zhang, Xiaoxin Cui, Rui Yan, Huajin Tang
By evaluating on challenging datasets including CIFAR-10, CIFAR-100 and ImageNet, the proposed method demonstrates the state-of-the-art performance in terms of accuracy, latency and energy preservation.
1 code implementation • 13 Dec 2021 • Guisong Liu, Wenjie Deng, Xiurui Xie, Li Huang, Huajin Tang
Specifically, we propose a directly-trained deep spiking reinforcement learning architecture based on the Leaky Integrate-and-Fire (LIF) neurons and Deep Q-Network (DQN).
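As a rough illustration of the LIF building block involved, a single leaky integrate-and-fire neuron can be simulated as below. This is a minimal sketch with assumed threshold, time-constant, and reset choices, not the paper's actual network or training procedure.

```python
def lif_forward(inputs, v_th=1.0, tau=2.0):
    """Simulate one leaky integrate-and-fire (LIF) neuron over T timesteps.

    inputs : list of input currents, one per timestep
    v_th   : firing threshold (assumed value)
    tau    : membrane time constant (assumed value)
    Returns the binary spike train, one 0/1 entry per step.
    """
    v, spikes = 0.0, []
    for x in inputs:
        v = v + (x - v) / tau      # leaky integration toward the input current
        if v >= v_th:              # fire and reset once the threshold is crossed
            spikes.append(1)
            v = 0.0                # hard reset (soft reset is a common variant)
        else:
            spikes.append(0)
    return spikes
```

In a spiking DQN of this kind, each action's Q-value is typically decoded from the output neurons' spike counts or membrane potentials, and training follows the usual DQN temporal-difference target, with surrogate gradients substituting for the non-differentiable spike function.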
no code implementations • 29 Sep 2021 • Gehua Ma, Jingyuan Zhao, Huajin Tang
Point-of-interest (POI) vector representation (embedding) is at the core of successive POI recommendation.
no code implementations • CVPR 2021 • Zehao Chen, Qian Zheng, Peisong Niu, Huajin Tang, Gang Pan
Image-based methods for indoor lighting estimation suffer from the problem of intensity-distance ambiguity.
no code implementations • 2 May 2020 • Qiang Yu, Shenglan Li, Huajin Tang, Longbiao Wang, Jianwu Dang, Kay Chen Tan
They are also believed to play an essential role in the low power consumption of biological systems, whose efficiency attracts increasing attention to the field of neuromorphic computing.
no code implementations • 14 Feb 2020 • Qianhui Liu, Haibo Ruan, Dong Xing, Huajin Tang, Gang Pan
Address event representation (AER) cameras have recently attracted increasing attention due to their high temporal resolution and low power consumption compared with traditional frame-based cameras.
no code implementations • 19 Nov 2019 • Qianhui Liu, Gang Pan, Haibo Ruan, Dong Xing, Qi Xu, Huajin Tang
This paper proposes an unsupervised address event representation (AER) object recognition approach.
no code implementations • 4 Feb 2019 • Qiang Yu, Yanli Yao, Longbiao Wang, Huajin Tang, Jianwu Dang, Kay Chen Tan
Our framework is a unifying system that consistently integrates three major functional parts: sparse encoding, efficient learning, and robust readout.
no code implementations • 28 Apr 2018 • Yangfan Hu, Huajin Tang, Gang Pan
SNNs theoretically have at least the same computational power as traditional artificial neural networks (ANNs).
no code implementations • 26 Feb 2015 • Xi Peng, Can-Yi Lu, Zhang Yi, Huajin Tang
Many works have shown that Frobenius-norm-based representation (FNR) is competitive with sparse representation and nuclear-norm-based representation (NNR) in numerous tasks such as subspace clustering.
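For context, these three representation models are commonly written under the regularized self-expressiveness formulation below (a standard sketch; the snippet above does not spell out the paper's exact objective):

```latex
% Self-expressive representation of the data matrix X, differing only in the norm on Z:
\min_{Z}\; \|X - XZ\|_F^2 + \lambda \|Z\|_1      % sparse representation (l1 norm)
\min_{Z}\; \|X - XZ\|_F^2 + \lambda \|Z\|_*      % nuclear-norm based representation (NNR)
\min_{Z}\; \|X - XZ\|_F^2 + \lambda \|Z\|_F^2    % Frobenius-norm based representation (FNR)
```

One practical appeal of FNR is that, unlike the l1 and nuclear-norm problems, it admits the closed-form solution \(Z^{*} = (X^{\top}X + \lambda I)^{-1} X^{\top}X\), avoiding iterative optimization.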
no code implementations • 22 Sep 2014 • Xi Peng, Rui Yan, Bo Zhao, Huajin Tang, Zhang Yi
Although these methods achieve a higher recognition rate than traditional spatial pyramid matching (SPM), they take more time to encode the local descriptors extracted from the image.
no code implementations • 25 Sep 2013 • Xi Peng, Huajin Tang, Lei Zhang, Zhang Yi, Shijie Xiao
In this paper, we propose a unified framework which makes representation-based subspace clustering algorithms feasible to cluster both out-of-sample and large-scale data.
no code implementations • 5 Sep 2012 • Xi Peng, Zhiding Yu, Huajin Tang, Zhang Yi
Under the framework of graph-based learning, the key to robust subspace clustering and subspace learning is to obtain a good similarity graph that eliminates the effects of errors and retains only connections between the data points from the same subspace (i.e., intra-subspace data points).
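In this graph-based pipeline, the similarity graph is typically built from a self-expressive coefficient matrix. The recipe below is the standard generic construction, not necessarily this paper's exact variant:

```latex
% 1. Solve for representation coefficients Z (the norm p varies by method):
\min_{Z}\; \|Z\|_{p} \quad \text{s.t.} \quad X = XZ,\; \operatorname{diag}(Z) = 0
% 2. Symmetrize the coefficients into a similarity graph:
W_{ij} = \tfrac{1}{2}\bigl(|Z_{ij}| + |Z_{ji}|\bigr)
% 3. Apply spectral clustering to W to obtain the subspace segmentation.
```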