Positive Competitive Networks for Sparse Reconstruction

We propose and analyze a continuous-time firing-rate neural network, the positive firing-rate competitive network (PFCN), to tackle sparse reconstruction problems with non-negativity constraints. These problems, which involve approximating a given input stimulus from a dictionary using a set of sparse (active) neurons, play a key role in a wide range of domains, including neuroscience, signal processing, and machine learning. First, by leveraging the theory of proximal operators, we relate the equilibria of a family of continuous-time firing-rate neural networks to the optimal solutions of sparse reconstruction problems. Then, we prove that the PFCN is a positive system and give rigorous conditions for convergence to the equilibrium. Specifically, we show that convergence (i) depends only on a property of the dictionary and (ii) is linear-exponential, in the sense that the convergence rate is initially at worst linear and, after a transient, becomes exponential. We also prove a number of technical results to assess the contractivity properties of the neural dynamics of interest. Our analysis leverages contraction theory to characterize the behavior of a family of firing-rate competitive networks for sparse reconstruction with and without non-negativity constraints. Finally, we validate the effectiveness of our approach via a numerical example.
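To make the setup concrete, the following is a minimal numerical sketch, not the authors' code: it assumes the PFCN takes the common LCA-style form that firing-rate competitive networks generalize, with dynamics tau*dx/dt = -x + ReLU((I - Phi^T Phi)x + Phi^T u - lambda), whose equilibria correspond to non-negative sparse codes of the stimulus. All names, dimensions, and parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumed LCA-style PFCN dynamics, not the paper's code):
#   tau * dx/dt = -x + ReLU((I - Phi^T Phi) @ x + Phi^T u - lam)
# integrated with forward Euler; the ReLU keeps all states non-negative.

rng = np.random.default_rng(0)

n, m = 20, 100                        # stimulus dimension, dictionary size
Phi = rng.normal(size=(n, m))
Phi /= np.linalg.norm(Phi, axis=0)    # unit-norm dictionary atoms

# Sparse, non-negative ground-truth code and the resulting stimulus
x_true = np.zeros(m)
x_true[rng.choice(m, size=5, replace=False)] = rng.uniform(0.5, 1.5, size=5)
u = Phi @ x_true

lam, tau, dt, steps = 0.05, 1.0, 0.01, 20000
W = np.eye(m) - Phi.T @ Phi           # recurrent (competitive) weights
b = Phi.T @ u                         # feedforward drive from the stimulus

x = np.zeros(m)                       # non-negative initial condition
for _ in range(steps):
    x += (dt / tau) * (-x + np.maximum(W @ x + b - lam, 0.0))

active = np.flatnonzero(x > 1e-3)
print("active neurons:", active)
print("reconstruction error:", np.linalg.norm(u - Phi @ x))
</parameter>
```

At the equilibrium only a few neurons remain active, and their firing rates give the non-negative sparse code of the stimulus; the ReLU nonlinearity is what makes the system positive, and (per the abstract) how well the dictionary Phi is conditioned governs the convergence behavior.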
