no code implementations • 6 Feb 2019 • Artem M. Grachev, Dmitry I. Ignatov, Andrey V. Savchenko
We propose a general pipeline for applying the most suitable methods to compress recurrent neural networks for language modeling.
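A minimal sketch of one step such a compression pipeline could include — low-rank factorization of a recurrent weight matrix via truncated SVD. This is illustrative only, not the authors' code; the matrix size and rank are assumptions.

```python
import numpy as np

def low_rank_factorize(W: np.ndarray, rank: int):
    """Approximate W (n x m) as A @ B with A (n x rank) and B (rank x m)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]      # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((1024, 1024))   # hypothetical recurrent weight matrix
    A, B = low_rank_factorize(W, rank=128)
    original, compressed = W.size, A.size + B.size
    print(f"parameters: {original} -> {compressed} ({original / compressed:.1f}x fewer)")
    print("relative reconstruction error:",
          np.linalg.norm(W - A @ B) / np.linalg.norm(W))
```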
no code implementations • 20 Aug 2017 • Artem M. Grachev, Dmitry I. Ignatov, Andrey V. Savchenko
In this paper, we consider several compression techniques for language models based on recurrent neural networks (RNNs).
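For illustration, a minimal sketch of two compression techniques commonly applied to RNN language models: magnitude pruning and uniform 8-bit quantization. The sparsity level and bit width below are assumptions, not the paper's settings.

```python
import numpy as np

def magnitude_prune(W: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(W), sparsity)
    return np.where(np.abs(W) < threshold, 0.0, W)

def quantize_uniform(W: np.ndarray, bits: int = 8):
    """Uniformly quantize W to signed `bits`-bit codes; return codes and scale."""
    scale = np.abs(W).max() / (2 ** (bits - 1) - 1)
    codes = np.round(W / scale).astype(np.int8)
    return codes, scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((512, 512))      # hypothetical RNN weight matrix
    W_sparse = magnitude_prune(W, sparsity=0.9)
    codes, scale = quantize_uniform(W_sparse)
    W_restored = codes.astype(np.float32) * scale
    print("nonzero weights kept:", np.count_nonzero(W_sparse), "of", W.size)
    print("relative quantization error:",
          np.linalg.norm(W_sparse - W_restored) / np.linalg.norm(W_sparse))
```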