no code implementations • 11 Jul 2020 • Hung-Yi Lee, Cheng-Hao Ho, Chien-Fu Lin, Chiung-Chih Chang, Chih-Wei Lee, Yau-Shian Wang, Tsung-Yuan Hsu, Kuan-Yu Chen
Conventional seq2seq chatbot models aim only to generate the response with the highest probability conditioned on the input sequence, without considering the sentiment of the output sentence.
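The behavior described above can be sketched with a toy example (not the paper's model): the decoder simply picks the most probable candidate response, so a high-probability generic reply beats a sentiment-appropriate one. The candidate strings and probabilities below are made-up illustrative values.

```python
def pick_response(candidates):
    """Return the candidate response with the highest model probability,
    ignoring any sentiment signal -- the conventional seq2seq objective."""
    return max(candidates, key=candidates.get)

# Hypothetical candidate responses with illustrative probabilities.
candidates = {
    "I don't know.": 0.42,            # generic and neutral, but most probable
    "That sounds wonderful!": 0.31,   # positive sentiment, lower probability
    "I'm sorry to hear that.": 0.27,  # negative sentiment, lowest probability
}

print(pick_response(candidates))  # -> I don't know.
```

A sentiment-aware model would instead rescore or condition these candidates on a target sentiment rather than taking the raw argmax.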
1 code implementation • ICLR 2020 • Fan-Keng Sun, Cheng-Hao Ho, Hung-Yi Lee
We present LAMOL, a simple yet effective method for lifelong language learning (LLL) based on language modeling.
Ranked #4 on Continual Learning on ASC (19 tasks)
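The core idea of LAMOL, as described above, is that a single language model both solves tasks and generates training samples, so that before learning a new task it can replay self-generated pseudo-samples of earlier tasks instead of storing real data. A minimal sketch of that training loop, with a hypothetical `lm` object (its `generate`/`train` methods and the `MockLM` stand-in are assumptions, not the authors' implementation):

```python
def lifelong_train(lm, tasks, replay_ratio=0.2):
    """Train one language model on tasks sequentially, mixing in
    pseudo-samples of earlier tasks generated by the model itself."""
    for i, task_data in enumerate(tasks):
        pseudo = []
        if i > 0:
            # Generate pseudo-samples standing in for all previous tasks;
            # their count is a fixed fraction of the new task's data.
            n_replay = int(len(task_data) * replay_ratio)
            pseudo = [lm.generate() for _ in range(n_replay)]
        lm.train(task_data + pseudo)

class MockLM:
    """Hypothetical stand-in for a trainable generative language model."""
    def __init__(self):
        self.batches = []
    def generate(self):
        return "<pseudo-sample>"
    def train(self, data):
        self.batches.append(list(data))

lm = MockLM()
lifelong_train(lm, [["task1-ex"] * 10, ["task2-ex"] * 10])
print([len(b) for b in lm.batches])  # -> [10, 12]
```

The second training batch contains 10 new examples plus 2 generated pseudo-samples, which is how replay mitigates catastrophic forgetting without an explicit memory buffer.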