no code implementations • 26 May 2024 • Regev Cohen, Idan Kligvasser, Ehud Rivlin, Daniel Freedman
In this paper, we employ information-theory tools to investigate this phenomenon, revealing a fundamental tradeoff between uncertainty and perception.
no code implementations • 19 May 2024 • Omer Belhasin, Idan Kligvasser, George Leifman, Regev Cohen, Erin Rainaldi, Li-Fang Cheng, Nishant Verma, Paul Varghese, Ehud Rivlin, Michael Elad
Analyzing the cardiovascular system condition via Electrocardiography (ECG) is a common and highly effective approach, and it has been practiced and perfected over many decades.
no code implementations • 17 May 2023 • Idan Kligvasser, George Leifman, Roman Goldenberg, Ehud Rivlin, Michael Elad
By integrating the local metric over the withdrawal phase, we build a global, offline quality metric, which is shown to be highly correlated with the standard Polyps Per Colonoscopy (PPC) quality metric.
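The aggregation step described above can be sketched in a few lines. This is an illustrative assumption, not the authors' exact formulation: per-frame (local) quality scores are integrated over the withdrawal phase and normalized by its duration, which reduces to a time-average of the local metric. The function name, the score range, and the frame rate are all hypothetical:

```python
def global_quality(local_scores, fps=30.0):
    """Integrate per-frame quality scores over the withdrawal phase.

    local_scores: per-frame quality values, assumed in [0, 1].
    fps: frames per second of the colonoscopy video.
    Returns the duration-normalized integral, i.e. the mean local quality.
    """
    if not local_scores:
        return 0.0
    duration = len(local_scores) / fps   # withdrawal length in seconds
    integral = sum(local_scores) / fps   # Riemann sum of the local metric
    return integral / duration           # normalize by withdrawal duration

print(global_quality([0.2, 0.4, 0.6, 0.8]))  # ~0.5
```

Because the normalization cancels the frame rate, the global score is insensitive to `fps`; only the distribution of local scores over the withdrawal matters.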
no code implementations • 3 Dec 2022 • Idan Kligvasser, Tamar Rott Shaham, Noa Alkobi, Tomer Michaeli
Training a generative model on a single image has drawn significant attention in recent years.
no code implementations • NeurIPS 2021 • Idan Kligvasser, Tamar Shaham, Yuval Bahat, Tomer Michaeli
Features extracted from deep layers of classification networks are widely used as image descriptors.
no code implementations • 3 Mar 2021 • Idan Kligvasser, Tomer Michaeli
Generative adversarial networks (GANs) are known to benefit from regularization or normalization of their critic (discriminator) network during training.
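One widely used instance of such critic normalization is spectral normalization, which rescales each weight matrix by an estimate of its largest singular value so the critic stays roughly 1-Lipschitz. The sketch below illustrates that general idea with a pure-Python power iteration; it is not the method proposed in this paper, and all names here are illustrative:

```python
import math

def spectral_norm(W, n_iters=50):
    """Estimate the largest singular value of W (a list of rows)
    via power iteration, as done in spectral normalization."""
    rows, cols = len(W), len(W[0])
    v = [1.0] * cols  # deterministic init for reproducibility
    for _ in range(n_iters):
        # u <- W v, normalized
        u = [sum(W[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        nu = math.sqrt(sum(x * x for x in u)) or 1.0
        u = [x / nu for x in u]
        # v <- W^T u, normalized
        v = [sum(W[i][j] * u[i] for i in range(rows)) for j in range(cols)]
        nv = math.sqrt(sum(x * x for x in v)) or 1.0
        v = [x / nv for x in v]
    # sigma = u^T W v
    return sum(u[i] * W[i][j] * v[j] for i in range(rows) for j in range(cols))

def normalize_weights(W):
    """Divide W by its spectral norm so the layer map is ~1-Lipschitz."""
    s = spectral_norm(W)
    return [[w / s for w in row] for row in W]

W = [[3.0, 0.0], [0.0, 1.0]]  # singular values are 3 and 1
print(round(spectral_norm(W), 3))  # 3.0
```

After `normalize_weights`, the largest singular value of the layer is 1, which bounds how sharply the critic can distinguish nearby inputs during training.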
1 code implementation • 27 Nov 2018 • Idan Kligvasser, Tomer Michaeli
For example, on ImageNet, our DxNet outperforms a ReLU-based DenseNet having 30% more parameters, achieving state-of-the-art results for this parameter budget.
1 code implementation • CVPR 2018 • Idan Kligvasser, Tamar Rott Shaham, Tomer Michaeli
However, state-of-the-art results are typically achieved by very deep networks, which can reach tens of layers with tens of millions of parameters.