no code implementations • 18 Mar 2021 • Florian Rehm, Sofia Vallecorsa, Vikram Saletore, Hans Pabst, Adel Chaibi, Valeriu Codreanu, Kerstin Borras, Dirk Krücker
We employ the novel Intel Low Precision Optimization Tool (iLoT) for quantization and compare the results against a model quantized with TensorFlow Lite.
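As a point of reference for the TensorFlow Lite side of that comparison, the sketch below shows standard post-training INT8 quantization with the TFLite converter. The model and calibration data are placeholders, not the authors' network; only the conversion API is real.

```python
# Minimal sketch of TensorFlow Lite post-training INT8 quantization,
# the baseline approach the paper compares iLoT against.
import numpy as np
import tensorflow as tf

# Placeholder model standing in for the authors' trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(25, 25, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

def representative_data_gen():
    # Calibration samples used to pick quantization ranges;
    # in practice these would be real inputs, not random noise.
    for _ in range(100):
        yield [np.random.rand(1, 25, 25, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
tflite_int8_model = converter.convert()

with open("model_int8.tflite", "wb") as f:
    f.write(tflite_int8_model)
```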
no code implementations • 3 Oct 2019 • Kushal Datta, Imtiaz Hossain, Sun Choi, Vikram Saletore, Kyle Ambert, William J. Godinez, Xian Zhang
In this work, we train a multi-scale convolutional neural network (M-CNN) to classify large biomedical images for high-content screening in one hour.
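To illustrate the general multi-scale idea, here is a hedged Keras sketch in which the same image is processed at several resolutions by parallel convolutional branches whose pooled features are concatenated before classification. The layer sizes and the three scales are illustrative, not the paper's exact M-CNN architecture.

```python
# Sketch of a multi-scale CNN: parallel branches over downsampled
# views of the input, merged via global pooling and concatenation.
import tensorflow as tf
from tensorflow.keras import layers

def multi_scale_cnn(input_shape=(512, 512, 3), num_classes=5, scales=(1, 2, 4)):
    inputs = tf.keras.Input(shape=input_shape)
    branch_outputs = []
    for s in scales:
        # Each branch sees the image downsampled by factor s.
        x = layers.AveragePooling2D(pool_size=s)(inputs) if s > 1 else inputs
        x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
        # Global pooling gives every branch the same output shape.
        branch_outputs.append(layers.GlobalAveragePooling2D()(x))
    merged = layers.Concatenate()(branch_outputs)
    outputs = layers.Dense(num_classes, activation="softmax")(merged)
    return tf.keras.Model(inputs, outputs)

model = multi_scale_cnn()
model.compile(optimizer="adam", loss="categorical_crossentropy")
```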
no code implementations • 3 Jun 2019 • Aishwarya Bhandare, Vamsi Sripathi, Deepthi Karkada, Vivek Menon, Sun Choi, Kushal Datta, Vikram Saletore
In this work, we quantize a trained Transformer machine language translation model, leveraging the INT8/VNNI instructions in the latest Intel® Xeon® Cascade Lake processors, to improve inference performance while maintaining less than a 0.5% drop in accuracy.
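The arithmetic that VNNI accelerates is an INT8 matrix multiply with INT32 accumulation. The NumPy sketch below shows symmetric per-tensor INT8 quantization of a weight matrix and activations, the integer matmul, and the dequantization step; shapes and the quantization scheme are illustrative, not the paper's exact recipe for the Transformer's GEMMs.

```python
# Sketch of symmetric per-tensor INT8 quantization and an INT8 matmul
# with INT32 accumulation, the pattern VNNI executes in hardware.
import numpy as np

def quantize_int8(x):
    # Symmetric scheme: map [-max|x|, max|x|] onto [-127, 127].
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64)).astype(np.float32)   # FP32 weights
A = rng.normal(size=(8, 64)).astype(np.float32)    # FP32 activations

Wq, w_scale = quantize_int8(W)
Aq, a_scale = quantize_int8(A)

# INT8 x INT8 -> INT32 accumulation, then dequantize back to FP32
# with the combined activation and weight scales.
acc_int32 = Aq.astype(np.int32) @ Wq.astype(np.int32).T
y_deq = acc_int32 * (a_scale * w_scale)

y_fp32 = A @ W.T
print("max abs error:", np.max(np.abs(y_fp32 - y_deq)))
```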
no code implementations • 10 May 2019 • Derya Cavdar, Valeriu Codreanu, Can Karakus, John A. Lockman III, Damian Podareanu, Vikram Saletore, Alexander Sergeev, Don D. Smith II, Victor Suthichai, Quy Ta, Srinivas Varadharajan, Lucas A. Wilson, Rengan Xu, Pei Yang
Neural machine translation, which uses neural networks to translate human language, is an area of active research exploring new neuron types and network topologies with the goal of dramatically improving machine translation performance.
no code implementations • 12 Nov 2017 • Valeriu Codreanu, Damian Podareanu, Vikram Saletore
Furthermore, as datasets grow, the representation learning potential of deep networks grows as well, provided correspondingly more complex models are used.