no code implementations • 16 Nov 2021 • Kai Liang, Huiru Zhong, Haoning Chen, Youlong Wu
Due to limited communication resources at the clients and a massive number of model parameters, large-scale distributed learning tasks suffer from a communication bottleneck.