Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations

CVPR 2021 · Umberto Michieli, Pietro Zanuttigh

Deep neural networks suffer from the major limitation of catastrophically forgetting old tasks when learning new ones. In this paper we focus on class-incremental continual learning in semantic segmentation, where new categories are made available over time while previous training data is not retained. The proposed continual learning scheme shapes the latent space to reduce forgetting while improving the recognition of novel classes. Our framework is driven by three novel components, which can also be combined effortlessly with existing techniques. First, prototype matching enforces latent-space consistency on old classes, constraining the encoder to produce similar latent representations for previously seen classes in subsequent steps. Second, feature sparsification makes room in the latent space to accommodate novel classes. Finally, contrastive learning is employed to cluster features according to their semantics while pushing apart those of different classes. Extensive evaluation on the Pascal VOC 2012 and ADE20K datasets demonstrates the effectiveness of our approach, which significantly outperforms state-of-the-art methods.
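
To make the three components concrete, below is a minimal PyTorch-style sketch of what such latent-space losses could look like. This is not the authors' code: the function names, the MSE/L1 proxies, the `margin` value, and the use of per-batch class means in place of the paper's running prototypes are all illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation) of the three latent-space
# losses, assuming `features` is the encoder output of shape (N, C, H, W),
# `labels` holds per-pixel class ids downsampled to (N, H, W), and
# `prototypes` is a dict mapping old-class id -> stored feature centroid (C,).

import torch
import torch.nn.functional as F

def masked_class_means(features, labels, class_ids):
    """Average feature vectors over the pixels of each class in `class_ids`."""
    n, c, h, w = features.shape
    flat_feats = features.permute(0, 2, 3, 1).reshape(-1, c)   # (N*H*W, C)
    flat_labels = labels.reshape(-1)                           # (N*H*W,)
    means = {}
    for cid in class_ids:
        mask = flat_labels == cid
        if mask.any():
            means[cid] = flat_feats[mask].mean(dim=0)          # (C,)
    return means

def prototype_matching_loss(features, labels, prototypes):
    """Pull current features of old classes toward their stored prototypes."""
    means = masked_class_means(features, labels, prototypes.keys())
    losses = [F.mse_loss(means[cid], prototypes[cid]) for cid in means]
    return torch.stack(losses).mean() if losses else features.new_zeros(())

def sparsity_loss(features):
    """Encourage sparse activations to leave latent room for future classes
    (a simple L1 proxy; the paper's formulation may differ)."""
    return features.abs().mean()

def repulsion_attraction_loss(features, labels, prototypes, margin=1.0):
    """Attract class means to their own prototype, repel them from the
    prototypes of other classes (hinge with an assumed margin)."""
    means = masked_class_means(features, labels, prototypes.keys())
    attract, repel = [], []
    for cid, mean in means.items():
        attract.append((mean - prototypes[cid]).norm())
        for other, proto in prototypes.items():
            if other != cid:
                repel.append(F.relu(margin - (mean - proto).norm()))
    parts = attract + repel
    return torch.stack(parts).mean() if parts else features.new_zeros(())
```

In practice these terms would be weighted and added to the usual segmentation loss; the paper's actual prototypes are presumably maintained as running statistics across incremental steps rather than recomputed per batch as done here.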

Datasets

PASCAL VOC 2012 · ADE20K

Results from the Paper


Task             Dataset          Model  Metric      Value  Global Rank
Disjoint 15-5    PASCAL VOC 2012  SDR    mIoU        67.3   #3
Overlapped 10-1  PASCAL VOC 2012  SDR    mIoU        25.1   #9
Disjoint 10-1    PASCAL VOC 2012  SDR    mIoU        14.3   #4
Disjoint 15-1    PASCAL VOC 2012  SDR    mIoU        48.7   #4
Overlapped 15-5  PASCAL VOC 2012  SDR    mIoU (val)  70.1   #7
Overlapped 15-1  PASCAL VOC 2012  SDR    mIoU        39.5   #9
