no code implementations • 27 May 2024 • Friedemann Zenke, Axel Laborieux
Finally, we outline avenues for further research to understand the brain's superb continual learning abilities and harness similar mechanisms for artificial intelligence systems.
1 code implementation • 23 Apr 2024 • Julia Gygax, Friedemann Zenke
We find that the latter provides the missing theoretical basis for surrogate gradients in stochastic spiking neural networks.
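In brief, surrogate gradient methods keep the binary spike in the forward pass but swap its undefined derivative for a smooth stand-in in the backward pass. A minimal sketch in PyTorch with a fast-sigmoid surrogate (the class name and slope constant `beta` are illustrative choices, not from the paper):

```python
import torch

class SurrGradSpike(torch.autograd.Function):
    """Heaviside spike with a fast-sigmoid surrogate derivative."""
    beta = 10.0  # surrogate slope; illustrative value

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # binary spike on threshold crossing

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Replace d(spike)/dv with the derivative of a fast sigmoid
        surrogate = 1.0 / (SurrGradSpike.beta * v.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrGradSpike.apply  # drop-in nonlinearity for SNN training
```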
1 code implementation • NeurIPS 2023 • Julian Rossbroich, Friedemann Zenke
How neuronal circuits achieve credit assignment remains a central unsolved question in systems neuroscience.
1 code implementation • 5 Sep 2023 • Axel Laborieux, Friedemann Zenke
Equilibrium propagation (EP) is a compelling alternative to the backpropagation of error algorithm (BP) for computing gradients of neural networks on biological or analog neuromorphic substrates.
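In outline, EP contrasts two relaxations of an energy-based network: a free phase and a phase in which the output is weakly nudged toward the target; the difference of energy gradients yields a local gradient estimate. A toy sketch of this two-phase procedure (the quadratic energy, `relax` helper, and nudging strength `beta` are illustrative assumptions, not the paper's model):

```python
import torch

def energy(W, s, x):
    # Toy quadratic energy over state s, input x, and weights W
    return 0.5 * (s ** 2).sum() - s @ W @ x

def relax(W, x, target=None, beta=0.0, steps=50, lr=0.1):
    # Descend the (possibly nudged) energy to an equilibrium state
    s = torch.zeros(W.shape[0], requires_grad=True)
    for _ in range(steps):
        E = energy(W, s, x)
        if target is not None:
            E = E + beta * ((s - target) ** 2).sum()  # weak nudge toward target
        (g,) = torch.autograd.grad(E, s)
        with torch.no_grad():
            s -= lr * g
    return s.detach()

def ep_gradient(W, x, target, beta=0.1):
    s_free = relax(W, x)                             # free phase
    s_nudge = relax(W, x, target=target, beta=beta)  # nudged phase
    W1 = W.clone().requires_grad_(True)
    (g_free,) = torch.autograd.grad(energy(W1, s_free, x), W1)
    W2 = W.clone().requires_grad_(True)
    (g_nudge,) = torch.autograd.grad(energy(W2, s_nudge, x), W2)
    return (g_nudge - g_free) / beta  # local estimate of dLoss/dW
```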
1 code implementation • NeurIPS 2023 • Manu Srinath Halvagal, Axel Laborieux, Friedemann Zenke
To gain further theoretical insight into non-contrastive SSL, we analytically study learning dynamics in conjunction with Euclidean and cosine similarity in the eigenspace of closed-form linear predictor networks.
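For orientation, non-contrastive SSL of this form trains an online branch to predict a stop-gradient target branch across two augmented views through a predictor network, which the paper analyzes in its linear, closed-form case. A minimal sketch of the two similarity variants with an explicit linear predictor (names and dimensions are illustrative):

```python
import torch
import torch.nn.functional as F

def noncontrastive_loss(z1, z2, predictor, similarity="cosine"):
    """Symmetrized predictor loss with stop-gradient targets.

    z1, z2: embeddings of two augmented views, shape (batch, dim).
    """
    p1, p2 = predictor(z1), predictor(z2)
    t1, t2 = z1.detach(), z2.detach()  # stop-gradient target branch
    if similarity == "cosine":
        return -0.5 * (F.cosine_similarity(p1, t2).mean()
                       + F.cosine_similarity(p2, t1).mean())
    # Euclidean variant
    return 0.5 * (((p1 - t2) ** 2).sum(-1).mean()
                  + ((p2 - t1) ** 2).sum(-1).mean())

predictor = torch.nn.Linear(128, 128, bias=False)  # linear predictor network
```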
1 code implementation • 1 Sep 2022 • Axel Laborieux, Friedemann Zenke
Equilibrium propagation (EP) is an alternative to backpropagation (BP) that allows the training of deep neural networks with local learning rules.
1 code implementation • 21 Jun 2022 • Julian Rossbroich, Julia Gygax, Friedemann Zenke
Thus, fluctuation-driven initialization provides a practical, versatile, and easy-to-implement strategy for improving SNN training performance on diverse tasks in neuromorphic engineering and computational neuroscience (a simplified sketch follows below).
Ranked #8 on Audio Classification on SHD
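The core idea is to scale initial weights so that each neuron's membrane potential fluctuates in a useful regime instead of saturating or falling silent. A simplified sketch assuming Poisson inputs and zero-mean weights (the firing rate `nu`, squared-kernel integral `eps_sq`, and target fluctuation `sigma_u` are illustrative parameters; the paper derives them from the neuron model):

```python
import numpy as np

def fluctuation_driven_init(n_inputs, nu=10.0, eps_sq=5e-3, sigma_u=1.0, seed=0):
    """Draw zero-mean weights whose scale targets membrane fluctuations.

    n_inputs: presynaptic fan-in
    nu:       assumed presynaptic firing rate (Hz)
    eps_sq:   integral of the squared PSP kernel (neuron-model dependent)
    sigma_u:  target standard deviation of the membrane potential
    """
    rng = np.random.default_rng(seed)
    # For zero-mean weights and Poisson input, Var[U] ~ n * nu * eps_sq * Var[w];
    # solve for the weight scale that hits the target fluctuation sigma_u.
    sigma_w = sigma_u / np.sqrt(n_inputs * nu * eps_sq)
    return rng.normal(0.0, sigma_w, size=n_inputs)

w = fluctuation_driven_init(n_inputs=200)
```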
2 code implementations • 30 May 2022 • Simon F. Muller-Cleve, Vittorio Fra, Lyes Khacef, Alejandro Pequeno-Zurro, Daniel Klepatsch, Evelina Forno, Diego G. Ivanovich, Shavika Rastogi, Gianvito Urgese, Friedemann Zenke, Chiara Bartolozzi
Spatio-temporal pattern recognition is a fundamental ability of the brain that is required for numerous real-world activities.
no code implementations • NeurIPS 2020 • Basile Confavreux, Friedemann Zenke, Everton Agnes, Timothy Lillicrap, Tim Vogels
Instead of experimental data, the rules are constrained by the functions they implement and the structure they are meant to produce.
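Concretely, this amounts to parameterizing a local plasticity rule and optimizing its coefficients against a functional objective rather than fitting recorded data. A minimal sketch of such a parameterized rule (the polynomial form and the stand-in objective are illustrative; the paper searches over comparable coefficients with evolutionary methods):

```python
import numpy as np

def plasticity_rule(theta, pre, post, w):
    """Local weight update parameterized by coefficients theta.

    E.g. theta = (1, 0, 0, -1) recovers an Oja-like rule.
    """
    a, b, c, d = theta
    return (a * pre * post        # Hebbian term
            + b * pre             # presynaptic-only term
            + c * post            # postsynaptic-only term
            + d * w * post ** 2)  # weight-dependent decay term

def inner_loop(theta, steps=1000, eta=1e-3, seed=0):
    # Train one linear neuron on random inputs with the candidate rule
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 0.1, size=10)
    for _ in range(steps):
        x = rng.normal(size=10)
        y = w @ x
        w += eta * plasticity_rule(theta, x, y, w)
    return w

# Outer loop: score each candidate theta by the function or structure it
# produces (e.g. how close |w| ends up to 1) and optimize theta with a
# black-box method such as CMA-ES.
```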
no code implementations • 22 Oct 2020 • Friedemann Zenke, Emre O. Neftci
Neuromorphic hardware strives to emulate brain-like neural networks and thus holds the promise for scalable, low-power information processing on temporal data streams.
1 code implementation • ICML 2020 • Tianlin Liu, Friedemann Zenke
Deep neural networks have dramatically transformed machine learning, but their memory and energy demands are substantial.
no code implementations • 12 Jun 2020 • Benjamin Cramer, Sebastian Billaudelle, Simeon Kanya, Aron Leibfried, Andreas Grübl, Vitali Karasenko, Christian Pehle, Korbinian Schreiber, Yannik Stradmann, Johannes Weis, Johannes Schemmel, Friedemann Zenke
To rapidly process temporal information at a low metabolic cost, biological neurons integrate inputs as an analog sum but communicate with spikes, binary events in time.
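This analog-integrate, spike-communicate scheme is captured by the leaky integrate-and-fire (LIF) neuron. A minimal discrete-time simulation sketch (time constants and threshold are illustrative values):

```python
import numpy as np

def lif_step(v, input_current, tau_mem=20e-3, dt=1e-3, v_thresh=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    Inputs are integrated as an analog sum on the membrane potential v;
    the output is a binary spike emitted when v crosses threshold.
    """
    v = v + dt / tau_mem * (-v + input_current)  # leaky analog integration
    spikes = (v >= v_thresh).astype(float)       # binary events in time
    v = v * (1.0 - spikes)                       # reset after each spike
    return v, spikes

v = np.zeros(4)
for _ in range(100):
    v, s = lif_step(v, input_current=2.0 * np.random.rand(4))
```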
no code implementations • 16 Oct 2019 • Benjamin Cramer, Yannik Stradmann, Johannes Schemmel, Friedemann Zenke
Spiking neural networks are the basis of versatile and power-efficient information processing in the brain.
Ranked #4 on Audio Classification on SHD
4 code implementations • 28 Jan 2019 • Emre O. Neftci, Hesham Mostafa, Friedemann Zenke
Spiking neural networks are nature's versatile solution to fault-tolerant and energy-efficient signal processing.
1 code implementation • 31 May 2017 • Friedemann Zenke, Surya Ganguli
In summary, our results open the door to a better scientific understanding of learning and computation in spiking neural networks by advancing our ability to train them to solve nonlinear problems involving transformations between different spatiotemporal spike-time patterns.
5 code implementations • ICML 2017 • Friedemann Zenke, Ben Poole, Surya Ganguli
While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning.