no code implementations • Findings (NAACL) 2022 • Diego Ortiz, Jose Moreno, Gilles Hubert, Karen Pinel-Sauvagnat, Lynda Tamine
Recent years have witnessed a growing interest towards learning distributed query representations that are able to capture search intent semantics.
1 code implementation • COLING 2022 • Jesús Lovón-Melgarejo, Jose G. Moreno, Romaric Besançon, Olivier Ferret, Lynda Tamine
Despite the success of state-of-the-art pre-trained language models (PLMs) on a series of multi-hop reasoning tasks, they still suffer from a limited ability to transfer learning from simple to complex tasks and vice versa.
no code implementations • 15 Dec 2023 • Jesús Lovón-Melgarejo, Jose G. Moreno, Romaric Besançon, Olivier Ferret, Lynda Tamine
In this work, we propose a task-agnostic evaluation method able to evaluate to what extent PLMs can capture complex taxonomy relations, such as ancestors and siblings.
1 code implementation • 8 Dec 2021 • Hanane Djeddal, Thomas Gerald, Laure Soulier, Karen Pinel-Sauvagnat, Lynda Tamine
In this work, our aim is to provide a structured answer in natural language to a complex information need.
1 code implementation • 18 Jan 2021 • Jesús Lovón-Melgarejo, Laure Soulier, Karen Pinel-Sauvagnat, Lynda Tamine
Several deep neural ranking models have been proposed in the recent IR literature.
no code implementations • 15 Jun 2017 • Gia-Hung Nguyen, Laure Soulier, Lynda Tamine, Nathalie Bricon-Souf
The state-of-the-art solutions to the vocabulary mismatch in information retrieval (IR) mainly aim at leveraging either the relational semantics provided by external resources or the distributional semantics, recently investigated by deep neural approaches.
no code implementations • 23 Jun 2016 • Gia-Hung Nguyen, Lynda Tamine, Laure Soulier, Nathalie Bricon-Souf
With this in mind, we argue that embedding KBs within deep neural architectures supporting document-query matching would give rise to fine-grained latent representations of both words and their semantic relations.