no code implementations • 17 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji
In this paper, we investigate the effect of individual hyperparameters, as well as different combinations of hyperparameter settings, on the performance of Attention-Gated Convolutional Neural Networks (AGCNNs), e.g., the kernel window size, the number of feature maps, the keep rate of the dropout layer, and the activation function.
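To make the studied hyperparameters concrete, the following is a minimal NumPy sketch of one convolutional block of a text CNN exposing those four knobs: kernel window size, number of feature maps, dropout keep rate, and the activation (ReLU here). The function name, weight initialization, and inverted-dropout formulation are illustrative assumptions, not the paper's actual AGCNN architecture (which adds an attention gate not shown here).

```python
import numpy as np

def conv_block(embeddings, kernel_size=3, num_filters=4, keep_rate=0.5, seed=0):
    """Illustrative sketch: one conv block over a sentence matrix.

    embeddings: (sentence_len, embed_dim) word-vector matrix.
    The hyperparameters mirror those studied in the paper; the
    weights are random placeholders, not trained values.
    """
    rng = np.random.default_rng(seed)
    sent_len, dim = embeddings.shape

    # One (kernel_size x embed_dim) filter per feature map.
    W = rng.standard_normal((num_filters, kernel_size, dim)) * 0.1
    b = np.zeros(num_filters)

    # Narrow convolution: slide the window over word positions.
    windows = np.stack([embeddings[i:i + kernel_size]
                        for i in range(sent_len - kernel_size + 1)])
    feats = np.einsum('wkd,fkd->wf', windows, W) + b

    # ReLU activation, then 1-max pooling per feature map.
    pooled = np.maximum(feats, 0.0).max(axis=0)

    # Inverted dropout with the given keep rate (training-time view).
    mask = rng.random(num_filters) < keep_rate
    return pooled * mask / keep_rate
```

Changing `kernel_size`, `num_filters`, or `keep_rate` here corresponds directly to the hyperparameter combinations whose interactions the paper evaluates.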
no code implementations • 10 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji
Activation functions play a key role in providing remarkable performance in deep neural networks, and the rectified linear unit (ReLU) is one of the most widely used activation functions.
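For reference, ReLU is simply the elementwise function max(0, x); a one-line NumPy version (illustrative, not the paper's code):

```python
import numpy as np

def relu(x):
    # Rectified linear unit: passes positive values through, zeroes negatives.
    return np.maximum(x, 0.0)

relu(np.array([-2.0, 0.0, 3.0]))  # -> array([0., 0., 3.])
```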
2 code implementations • 22 Aug 2018 • Yang Liu, Lixin Ji, Ruiyang Huang, Tuosiyu Ming, Chao GAO, Jianpeng Zhang
Sentence classification is very challenging, since sentences contain limited contextual information.