no code implementations • CMCL (ACL) 2022 • Joshua Bensemann, Alex Peng, Diana Prado, Yang Chen, Neset Tan, Paul Michael Corballis, Patricia Riddle, Michael Witbrock
Attention describes cognitive processes that are important to many human phenomena, including reading.
1 code implementation • 13 Oct 2023 • Qiming Bao, Gael Gendron, Alex Yuxuan Peng, Wanjun Zhong, Neset Tan, Yang Chen, Michael Witbrock, Jiamou Liu
Despite their high performance on the original publicly available datasets, we find that all models perform poorly on these newly constructed datasets.
1 code implementation • 2 Aug 2023 • Tim Hartill, Neset Tan, Michael Witbrock, Patricia J. Riddle
We equip a smaller Language Model to generalise to answering challenging compositional questions that have not been seen in training.
1 code implementation • 21 May 2023 • Qiming Bao, Alex Yuxuan Peng, Zhenyun Deng, Wanjun Zhong, Gael Gendron, Timothy Pistotti, Neset Tan, Nathan Young, Yang Chen, Yonghua Zhu, Paul Denny, Michael Witbrock, Jiamou Liu
Combining large language models with logical reasoning enhances their capacity to address problems in a robust and reliable manner.
1 code implementation • 28 Jul 2022 • Qiming Bao, Alex Yuxuan Peng, Tim Hartill, Neset Tan, Zhenyun Deng, Michael Witbrock, Jiamou Liu
In our model, reasoning is performed using an iterative memory neural network based on an RNN with a gated attention mechanism.
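The entry above only names the architecture, so the following is a minimal, illustrative sketch of what one gated-attention memory hop could look like, not the authors' released implementation; the class name, dimensions, number of hops, and the GRU-style state transition are all assumptions made for the example.

```python
# Illustrative sketch (assumption, not the paper's code): one iterative
# memory hop in which the memory state attends over an encoded context and
# a learned gate controls how much of the attended evidence is written back.
import torch
import torch.nn as nn


class GatedAttentionMemoryCell(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.attn_score = nn.Linear(2 * hidden_size, 1)      # scores each context vector against the memory
        self.gate = nn.Linear(2 * hidden_size, hidden_size)  # gate controlling the memory update
        self.update = nn.GRUCell(hidden_size, hidden_size)   # RNN-style state transition

    def forward(self, memory: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # memory: (batch, hidden); context: (batch, seq_len, hidden)
        expanded = memory.unsqueeze(1).expand(-1, context.size(1), -1)
        scores = self.attn_score(torch.cat([context, expanded], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)                       # attention over context positions
        attended = torch.bmm(weights.unsqueeze(1), context).squeeze(1)
        gate = torch.sigmoid(self.gate(torch.cat([attended, memory], dim=-1)))
        candidate = self.update(attended, memory)                     # RNN update from attended evidence
        return gate * candidate + (1.0 - gate) * memory               # gated write into the memory state


if __name__ == "__main__":
    cell = GatedAttentionMemoryCell(hidden_size=64)
    context = torch.randn(2, 10, 64)   # e.g. encoded premises/rules
    memory = torch.zeros(2, 64)        # initial memory / query state
    for _ in range(3):                 # a few iterative reasoning hops (hop count is an assumption)
        memory = cell(memory, context)
    print(memory.shape)                # torch.Size([2, 64])
```

Running the cell for a fixed number of hops lets the memory state accumulate evidence from the context step by step, which is the general pattern an iterative memory network with gated attention follows.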