no code implementations • 6 Jan 2024 • Siavash Golkar, Jules Berman, David Lipshutz, Robert Mihai Haret, Tim Gollisch, Dmitri B. Chklovskii
Such variation in the temporal filter with input SNR resembles that observed experimentally in biological neurons.
1 code implementation • NeurIPS 2023 • Lyndon R. Duong, Eero P. Simoncelli, Dmitri B. Chklovskii, David Lipshutz
Neurons in early sensory areas rapidly adapt to changing sensory statistics, both by normalizing the variance of their individual responses and by reducing correlations between their responses.
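The two operations this sentence describes — normalizing each response's variance and decorrelating responses — together amount to statistical whitening. A minimal offline sketch (illustrative only; not the adaptive circuit studied in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated "neural responses": n_samples x n_neurons.
X = rng.standard_normal((10000, 3)) @ np.array(
    [[2.0, 0.5, 0.0],
     [0.0, 1.0, 0.3],
     [0.0, 0.0, 0.5]]
)

# Step 1: normalize the variance of each individual response.
X_norm = X / X.std(axis=0)

# Step 2: decorrelate via symmetric (ZCA) whitening of the normalized responses.
C = np.cov(X_norm, rowvar=False)
evals, evecs = np.linalg.eigh(C)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T  # symmetric whitening matrix
Y = X_norm @ W

# The whitened responses have identity covariance:
# unit variances, zero correlations.
print(np.round(np.cov(Y, rowvar=False), 2))
```

Because `W` is computed from the sample covariance itself, the output covariance is the identity up to floating-point error.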
no code implementations • 20 Feb 2023 • David Lipshutz, Yanis Bahroun, Siavash Golkar, Anirvan M. Sengupta, Dmitri B. Chklovskii
These NN models account for many anatomical and physiological observations; however, their objectives have limited computational power, and the derived NNs do not explain the multi-compartmental neuronal structures and non-Hebbian forms of plasticity that are prevalent throughout the brain.
1 code implementation • 27 Jan 2023 • Lyndon R. Duong, David Lipshutz, David J. Heeger, Dmitri B. Chklovskii, Eero P. Simoncelli
Statistical whitening transformations play a fundamental role in many computational systems, and may also play an important role in biological sensory systems.
no code implementations • 14 Nov 2022 • Siavash Golkar, David Lipshutz, Tiberiu Tesileanu, Dmitri B. Chklovskii
However, the performance of cPCA is sensitive to the choice of hyperparameter, and there is currently no online algorithm for implementing cPCA.
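The hyperparameter sensitivity mentioned here can be seen in a small offline sketch of contrastive PCA, which takes the leading eigenvectors of the difference between foreground and background covariances (this is the standard offline eigendecomposition, not the online algorithm the paper develops; the data and α values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Foreground data contains structure of interest plus nuisance variation;
# background data contains only the nuisance variation.
nuisance = rng.standard_normal((5000, 2)) * np.array([3.0, 1.0])
signal = np.zeros((5000, 2))
signal[:, 1] = 2.0 * rng.standard_normal(5000)
foreground = nuisance + signal
background = rng.standard_normal((5000, 2)) * np.array([3.0, 1.0])

def cpca_direction(fg, bg, alpha):
    """Top contrastive direction: leading eigenvector of C_fg - alpha * C_bg."""
    C_fg = np.cov(fg, rowvar=False)
    C_bg = np.cov(bg, rowvar=False)
    evals, evecs = np.linalg.eigh(C_fg - alpha * C_bg)
    return evecs[:, -1]  # eigh returns eigenvalues in ascending order

# alpha = 0 reduces to ordinary PCA, dominated by the nuisance axis 0;
# alpha = 1 discounts background variance and picks out the signal axis 1.
print(np.abs(cpca_direction(foreground, background, alpha=0.0)))
print(np.abs(cpca_direction(foreground, background, alpha=1.0)))
```

Sweeping α between these extremes changes which direction wins, which is the sensitivity the sentence refers to.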
no code implementations • 21 Sep 2022 • David Lipshutz, Cengiz Pehlevan, Dmitri B. Chklovskii
To this end, we consider two mathematically tractable recurrent linear neural networks that statistically whiten their inputs -- one with direct recurrent connections and the other with interneurons that mediate recurrent communication.
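The direct-recurrent-connection variant can be illustrated with a linear circuit whose leaky dynamics settle at a whitened output. The choice of recurrent weights below is hand-picked from the input covariance to expose the fixed point, not the learned circuit analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated inputs with covariance Sigma.
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 0.5 * np.eye(3)
X = rng.multivariate_normal(np.zeros(3), Sigma, size=20000)

# Symmetric matrix square root of Sigma.
evals, evecs = np.linalg.eigh(Sigma)
Sigma_half = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

# Direct recurrent connections G chosen so the circuit's fixed point
# y = x - G y, i.e. y = (I + G)^{-1} x, whitens the input:
# pick I + G = Sigma^{1/2}, giving cov(y) = Sigma^{-1/2} Sigma Sigma^{-1/2} = I.
G = Sigma_half - np.eye(3)

# Simulate leaky recurrent dynamics dy/dt = -y + x - G y (Euler steps).
Y = np.zeros_like(X)
dt = 0.1
for _ in range(200):
    Y += dt * (-Y + X - Y @ G.T)

print(np.round(np.cov(Y, rowvar=False), 2))  # ~ identity
```

The interneuron variant factors the same recurrence through a second population instead of direct connections; the equilibrium computation is analogous.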
no code implementations • 30 Nov 2020 • Siavash Golkar, David Lipshutz, Yanis Bahroun, Anirvan M. Sengupta, Dmitri B. Chklovskii
The backpropagation algorithm is an invaluable tool for training artificial neural networks; however, because of its weight-sharing requirement, it does not provide a plausible model of brain function.
1 code implementation • NeurIPS 2020 • David Lipshutz, Charlie Windolf, Siavash Golkar, Dmitri B. Chklovskii
Furthermore, when trained on naturalistic stimuli, SFA reproduces interesting properties of cells in the primary visual cortex and hippocampus, suggesting that the brain uses temporal slowness as a computational principle for learning latent features.
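The temporal-slowness principle behind SFA can be sketched offline: find the unit-variance projection whose temporal derivative has minimal variance, a generalized eigenproblem between the covariance of the signal and the covariance of its differences (a two-channel toy example, not the paper's biologically plausible network):

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 2000)
slow = np.sin(t)        # slowly varying latent
fast = np.sin(29 * t)   # quickly varying latent
# Observed signals are mixtures of the two latents.
X = np.stack([slow + fast, slow - fast], axis=1)
X -= X.mean(axis=0)

# SFA: minimize the variance of the output's temporal derivative
# subject to unit output variance.
dX = np.diff(X, axis=0)
C = np.cov(X, rowvar=False)
Cdot = np.cov(dX, rowvar=False)

# Whiten, then take the eigenvector of the whitened derivative
# covariance with the smallest eigenvalue (the slowest direction).
evals, evecs = np.linalg.eigh(C)
Whalf_inv = evecs @ np.diag(evals ** -0.5) @ evecs.T
evals2, evecs2 = np.linalg.eigh(Whalf_inv @ Cdot @ Whalf_inv)
w = Whalf_inv @ evecs2[:, 0]  # slowest feature direction

y = X @ w
# y recovers the slow latent (up to sign and scale).
print(np.abs(np.corrcoef(y, slow)[0, 1]))
```

The slowest feature pulls the low-frequency latent out of the mixture, which is the sense in which temporal slowness acts as a computational principle for learning latent features.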
1 code implementation • 23 Oct 2020 • David Lipshutz, Cengiz Pehlevan, Dmitri B. Chklovskii
To model how the brain performs this task, we seek a biologically plausible single-layer neural network implementation of a blind source separation algorithm.
no code implementations • NeurIPS 2020 • Siavash Golkar, David Lipshutz, Yanis Bahroun, Anirvan M. Sengupta, Dmitri B. Chklovskii
Here, adopting a normative approach, we model these instructive signals as supervisory inputs guiding the projection of the feedforward data.
1 code implementation • 1 Oct 2020 • David Lipshutz, Yanis Bahroun, Siavash Golkar, Anirvan M. Sengupta, Dmitri B. Chklovskii
For biological plausibility, we require that the network operates in the online setting and its synaptic update rules are local.
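A textbook example satisfying both constraints — online operation and local synaptic updates — is Oja's rule for learning the top principal component; it is shown here only to illustrate the two requirements, not as the network the paper derives:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stream of inputs whose principal component lies along u = [3, 1]/sqrt(10).
u = np.array([3.0, 1.0]) / np.sqrt(10.0)
pc = 3.0 * rng.standard_normal((20000, 1))
X = pc @ u[None, :] + 0.3 * rng.standard_normal((20000, 2))

w = 0.1 * rng.standard_normal(2)
eta = 0.01
for x in X:                     # online: one sample at a time
    y = w @ x                   # scalar output of a single linear neuron
    w += eta * y * (x - y * w)  # Oja's rule: local -- uses only x, y, and w

print(np.abs(w @ u))  # ~1: w aligns with the top principal component
```

Each update touches only quantities available at the synapse (the presynaptic input, the postsynaptic output, and the weight itself), which is what "local" means in this setting.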