no code implementations • 29 Jul 2022 • Naima Otberdout, Claudio Ferrari, Mohamed Daoudi, Stefano Berretti, Alberto del Bimbo
We thus propose a new model that generates transitions between different expressions, and synthesizes long and composed 4D expressions.
1 code implementation • 4 Jul 2022 • Baptiste Chopin, Hao Tang, Naima Otberdout, Mohamed Daoudi, Nicu Sebe
To address this limitation, we propose a novel interaction Transformer (InterFormer), a Transformer network with both temporal and spatial attention.
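The InterFormer architecture itself is not detailed in this snippet, but the combination of temporal and spatial attention over a pose sequence can be sketched generically. The sketch below is an assumption-laden illustration, not the paper's implementation: it applies scaled dot-product self-attention first across joints within each frame (spatial), then across frames for each joint (temporal), on a toy sequence of shape (frames, joints, features). All names and tensor shapes here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention over the last two axes
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def spatial_temporal_block(x):
    # x: (T, J, D) pose sequence -- T frames, J joints, D features
    # spatial attention: joints attend to each other within a frame
    x = x + attention(x, x, x)            # (T, J, D)
    # temporal attention: frames attend to each other, per joint
    xt = x.swapaxes(0, 1)                 # (J, T, D)
    xt = xt + attention(xt, xt, xt)
    return xt.swapaxes(0, 1)              # back to (T, J, D)

rng = np.random.default_rng(0)
seq = rng.standard_normal((8, 17, 16))    # 8 frames, 17 joints, 16-dim features
out = spatial_temporal_block(seq)
print(out.shape)  # (8, 17, 16)
```

A real Transformer block would add learned query/key/value projections, multiple heads, layer normalization, and a feed-forward sublayer; this sketch keeps only the two attention axes to show how spatial and temporal factorization composes.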
no code implementations • 1 Mar 2022 • Baptiste Chopin, Naima Otberdout, Mohamed Daoudi, Angela Bartolo
Using such a compact representation avoids error accumulation and provides a robust representation for long-term prediction, while ensuring the smoothness and coherence of the whole motion.
no code implementations • 18 May 2021 • Baptiste Chopin, Naima Otberdout, Mohamed Daoudi, Angela Bartolo
Human motion prediction aims to forecast future human poses given a prior pose sequence.
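As a concrete illustration of the task framing above (forecasting future poses from an observed pose sequence), a standard constant-velocity baseline simply extrapolates the last observed displacement. This is a common reference point in the motion-prediction literature, not the method proposed in the paper:

```python
import numpy as np

def constant_velocity_predict(poses, horizon):
    # poses: (T, D) observed pose sequence (T frames, D pose dims)
    # extrapolate the last inter-frame displacement for `horizon` steps
    v = poses[-1] - poses[-2]
    steps = np.arange(1, horizon + 1)[:, None]
    return poses[-1] + steps * v          # (horizon, D)

history = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0]])
future = constant_velocity_predict(history, 3)
# rows: [3, 1.5], [4, 2], [5, 2.5]
```

Learned predictors are evaluated by how much they improve over such naive extrapolation, especially at long horizons where accumulated error dominates.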
no code implementations • CVPR 2022 • Naima Otberdout, Claudio Ferrari, Mohamed Daoudi, Stefano Berretti, Alberto del Bimbo
This allows us to learn how the motion of a sparse set of landmarks influences the deformation of the overall face surface, independently of the identity.
no code implementations • 23 Jul 2019 • Naima Otberdout, Mohamed Daoudi, Anis Kacem, Lahoucine Ballihi, Stefano Berretti
In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image.
no code implementations • 25 Oct 2018 • Naima Otberdout, Anis Kacem, Mohamed Daoudi, Lahoucine Ballihi, Stefano Berretti
In this paper, we propose a new approach for facial expression recognition using deep covariance descriptors.
no code implementations • 10 May 2018 • Naima Otberdout, Anis Kacem, Mohamed Daoudi, Lahoucine Ballihi, Stefano Berretti
In this paper, covariance matrices are exploited to encode deep convolutional neural network (DCNN) features for facial expression recognition.
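A minimal sketch of the core idea, under assumptions: given a convolutional feature map flattened to N spatial locations with C channels, the covariance descriptor is the C×C sample covariance of those feature vectors, lightly regularized so it stays symmetric positive definite (SPD), the property that SPD-manifold metrics commonly used with covariance descriptors rely on. The helper name and shapes below are illustrative, not from the paper.

```python
import numpy as np

def covariance_descriptor(features, eps=1e-6):
    # features: (N, C) -- N spatial locations, C channels from a conv layer
    mu = features.mean(axis=0)
    centered = features - mu
    cov = centered.T @ centered / (features.shape[0] - 1)
    # small diagonal regularizer keeps the matrix strictly positive definite
    return cov + eps * np.eye(features.shape[1])

rng = np.random.default_rng(0)
feat_map = rng.standard_normal((7, 7, 64)).reshape(-1, 64)  # flattened 7x7 map
desc = covariance_descriptor(feat_map)
print(desc.shape)  # (64, 64)
```

Because the descriptor captures pairwise channel correlations rather than raw activations, it is compact (C×C regardless of spatial resolution) and naturally invariant to the ordering of spatial locations.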