no code implementations • 9 Jun 2022 • Zhenwei Dai, Vasileios Ioannidis, Soji Adeshina, Zak Jost, Christos Faloutsos, George Karypis
ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling.
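The idea of picking high-uncertainty points that also cover different regions of the sample space can be sketched as below. This is an illustrative reconstruction, not the paper's implementation: the uncertainty measure (predictive entropy) and the greedy farthest-point spread are assumptions, and `diverse_uncertainty_sample` is a hypothetical name.

```python
import math

def entropy(probs):
    """Predictive entropy of a class-probability vector, used as the uncertainty score."""
    return -sum(p * math.log(p + 1e-12) for p in probs)

def diverse_uncertainty_sample(points, probs, budget, pool_factor=4):
    """Sketch of diversity-aware uncertainty sampling (hypothetical stand-in
    for ScatterSample's DiverseUncertainty module; the paper's exact
    procedure may differ).
    Step 1: keep the most uncertain candidates.
    Step 2: spread the labeling budget across the sample space with greedy
    farthest-point selection, so selected points come from different regions."""
    ranked = sorted(range(len(points)), key=lambda i: -entropy(probs[i]))
    pool = ranked[: budget * pool_factor]   # high-uncertainty candidate pool
    chosen = [pool[0]]                      # seed with the most uncertain point

    def dist(a, b):
        return math.dist(points[a], points[b])

    while len(chosen) < budget:
        # pick the candidate farthest from everything already chosen
        nxt = max(pool, key=lambda i: min(dist(i, c) for c in chosen))
        if nxt in chosen:
            break
        chosen.append(nxt)
    return chosen

# Toy usage: two well-separated clusters, all points equally uncertain.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10), (11, 11)]
uniform = [[0.5, 0.5]] * len(pts)
picked = diverse_uncertainty_sample(pts, uniform, budget=2)
```

With a budget of 2 and uniform uncertainty, the greedy spread selects one point from each cluster rather than two neighbors, which is the behavior the snippet above describes.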
no code implementations • 23 Oct 2021 • Zhenwei Dai, Chen Dun, Yuxin Tang, Anastasios Kyrillidis, Anshumali Shrivastava
Federated learning enables many local devices to train a deep learning model jointly without sharing the local data.
no code implementations • 29 Sep 2021 • Zichao Wang, Weili Nie, Zhenwei Dai, Richard Baraniuk
Many existing approaches either require extensive training or fine-tuning of the LM for each attribute to be controlled, or are slow to generate text.

no code implementations • NeurIPS 2020 • Zhenwei Dai, Anshumali Shrivastava
Recent work suggests improving the performance of the Bloom filter by incorporating a machine learning model as a binary classifier.
no code implementations • 22 Jul 2019 • Zhenwei Dai, Reinhard Heckel
This effect prevails in deep single-channel linear convolutional networks, and we show that without channel normalization, gradient descent takes at least exponentially many steps to come close to an optimum.