no code implementations • 1 Oct 2022 • Cheng Tang
There has been rapid development of, and growing interest in, adversarial training and defenses in the machine learning community in recent years.
no code implementations • 27 Dec 2020 • Cheng Tang, Andrew Arnold
Recently, Nogueira et al. [2019] proposed a new approach to document expansion based on a neural Seq2Seq model, showing significant improvement on short-text retrieval tasks.
no code implementations • NeurIPS 2019 • Cheng Tang
We present Matrix Krasulina, an algorithm for online k-PCA, obtained by generalizing the classic Krasulina's method (Krasulina, 1969) from the vector to the matrix case.
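The matrix generalization can be sketched as follows: maintain an iterate W with k orthonormal rows, project each incoming sample, and nudge W toward the sample's residual. The NumPy sketch below is illustrative only — the step size, epoch count, and per-step QR re-orthonormalization are assumptions, not the paper's exact algorithm or step-size schedule.

```python
import numpy as np

def matrix_krasulina(X, k, lr=0.1, epochs=5, seed=0):
    """Sketch of a matrix-valued Krasulina-style update for online
    k-PCA. Hyperparameters and the per-step QR re-orthonormalization
    are illustrative assumptions.

    X : (n, d) array, one sample per row.
    Returns W : (k, d) whose rows estimate the top-k principal subspace.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random orthonormal initialization of the k x d iterate.
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    W = Q.T
    for _ in range(epochs):
        for i in rng.permutation(n):
            x = X[i]
            y = W @ x                              # k-dim projection of x
            W = W + lr * np.outer(y, x - W.T @ y)  # Krasulina-style step
            Q, _ = np.linalg.qr(W.T)               # keep rows orthonormal
            W = Q.T
    return W
```

For k = 1 with a unit-norm iterate, the update reduces to the familiar vector form w ← w + η (wᵀx)(x − (wᵀx) w) of Krasulina's method.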
no code implementations • NeurIPS 2018 • Cheng Tang, Damien Garreau, Ulrike Von Luxburg
As a consequence, even highly randomized trees can lead to inconsistent forests if no subsampling is used, which implies that some of the commonly used setups for random forests can be inconsistent.
no code implementations • ICLR 2018 • Cheng Tang, Claire Monteleoni
Auto-encoders are commonly used for unsupervised representation learning and for pre-training deeper neural networks.
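The basic auto-encoder recipe — reconstruct the input through a low-dimensional bottleneck — can be sketched with a purely linear model. This is only an illustrative toy (the paper's setting is richer); the architecture, step size, and epoch count below are assumptions.

```python
import numpy as np

def train_linear_autoencoder(X, h, lr=0.1, epochs=500, seed=0):
    """Minimal linear autoencoder trained by full-batch gradient
    descent on squared reconstruction error; an illustrative sketch,
    not the paper's model.

    Encoder W1 : (d, h), decoder W2 : (h, d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = 0.1 * rng.standard_normal((d, h))
    W2 = 0.1 * rng.standard_normal((h, d))
    for _ in range(epochs):
        Z = X @ W1                  # encode to h-dim codes
        R = Z @ W2 - X              # reconstruction residual
        gW2 = Z.T @ R / n           # grad of 0.5*||R||^2/n w.r.t. W2
        gW1 = X.T @ (R @ W2.T) / n  # grad w.r.t. W1
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2
```

A classical observation motivating this toy: a linear autoencoder with bottleneck h trained to minimize reconstruction error recovers (the span of) the top-h principal components.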
no code implementations • 16 Nov 2016 • Cheng Tang, Claire Monteleoni
We analyze online (Bottou and Bengio, 1994) and mini-batch (Sculley, 2010) $k$-means variants.
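For concreteness, the Sculley-style mini-batch variant updates each center with a per-center learning rate that decays with that center's update count. A hedged sketch follows — the farthest-first seeding and fixed iteration budget are assumptions for this illustration, not the exact specification of the analyzed variants.

```python
import numpy as np

def minibatch_kmeans(X, k, batch_size=32, iters=100, seed=0):
    """Mini-batch k-means sketch: each sampled point nudges its nearest
    center with step size 1/count, where count is that center's number
    of updates so far. Seeding and stopping are illustrative choices.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Greedy farthest-first seeding (one simple choice).
    centers = np.empty((k, d))
    centers[0] = X[rng.integers(n)]
    for j in range(1, k):
        dist = ((X[:, None, :] - centers[None, :j, :]) ** 2).sum(-1).min(1)
        centers[j] = X[dist.argmax()]
    counts = np.zeros(k)
    for _ in range(iters):
        batch = X[rng.choice(n, batch_size, replace=False)]
        # Assign each batch point to its nearest current center.
        d2 = ((batch[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for x, j in zip(batch, labels):
            counts[j] += 1
            eta = 1.0 / counts[j]                 # per-center step size
            centers[j] = (1.0 - eta) * centers[j] + eta * x
    return centers
```

With the 1/count step size, each center is exactly the running average of the points assigned to it, which is what makes the stochastic updates track the batch k-means centroids.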
no code implementations • 16 Oct 2016 • Cheng Tang, Claire Monteleoni
We analyze online and mini-batch k-means variants.