no code implementations • 21 Jun 2022 • Saeed Masiha, Amin Gohari, Mohammad Hossein Yassaee
In this paper, we provide three applications for $f$-divergences: (i) we introduce Sanov's upper bound on the tail probability of the sum of independent random variables based on super-modular $f$-divergence and show that our generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length.
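As background for (i), the classical Sanov/Chernoff bound on the tail of a sum of i.i.d. Bernoulli variables is $P(\bar{X}_n \ge a) \le e^{-n D(a\|p)}$ for $a > p$, where $D$ is the binary KL divergence. The following sketch checks this classical bound numerically; it illustrates only the ordinary bound, not the paper's super-modular generalization, and all function names are ours.

```python
import math
import random

def kl_bernoulli(a, p):
    """Binary KL divergence D(Ber(a) || Ber(p))."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def sanov_bound(n, a, p):
    """Classical Chernoff-Sanov upper bound on P(mean of n Ber(p) draws >= a), a > p."""
    return math.exp(-n * kl_bernoulli(a, p))

def empirical_tail(n, a, p, trials=20000, seed=0):
    """Monte Carlo estimate of P(mean of n Ber(p) draws >= a)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        successes = sum(rng.random() < p for _ in range(n))
        if successes >= a * n:
            hits += 1
    return hits / trials

n, a, p = 50, 0.6, 0.5
print(empirical_tail(n, a, p), sanov_bound(n, a, p))
```

The empirical tail probability stays below the exponential bound, as the theorem guarantees; the paper's contribution is a strictly tighter bound of the same exponential form.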
no code implementations • 29 Sep 2021 • Alireza Masoumian, Shayan Kiyani, Mohammad Hossein Yassaee
We also provide an order-optimal algorithm that achieves this lower bound.
no code implementations • 10 Feb 2021 • Mohammad Saeed Masiha, Amin Gohari, Mohammad Hossein Yassaee, Mohammad Reza Aref
We study learning algorithms when there is a mismatch between the distributions of the training and test datasets of a learning algorithm.
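A standard way to reason about such a train/test mismatch is covariate shift correction by importance weighting, where the test risk is estimated from training samples reweighted by the density ratio $w(x) = p_{\text{test}}(x)/p_{\text{train}}(x)$. The sketch below is a generic textbook illustration under that assumption, not the paper's analysis, and the distributions and names are hypothetical.

```python
import random

def weighted_risk(samples, loss, weight):
    """Estimate the test risk from training samples via importance weights
    w(x) = p_test(x) / p_train(x)."""
    return sum(weight(x) * loss(x) for x in samples) / len(samples)

# Hypothetical setup: train ~ Uniform[0, 1); test has density 2x on [0, 1),
# so the importance weight is w(x) = 2x.
rng = random.Random(0)
train = [rng.random() for _ in range(100_000)]
loss = lambda x: x * x

est = weighted_risk(train, loss, lambda x: 2 * x)
# True test risk: E_test[x^2] = integral of 2x * x^2 over [0, 1) = 1/2
print(est)
```

The reweighted estimate recovers the test-distribution risk even though every sample came from the training distribution, which is the basic phenomenon a mismatch analysis must control.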
Information Theory