DeepChange: A Large Long-Term Person Re-Identification Benchmark with Clothes Change

Existing person re-identification (re-id) works mostly consider short-term application scenarios without clothes change. In the real world, however, people often dress differently across space and time. To bridge this gap, a few recent attempts have been made at long-term re-id with clothes change. Currently, one of the most significant limitations in this field is the lack of a large, realistic benchmark. In this work, we contribute a large, realistic long-term person re-identification benchmark named DeepChange. It has several unique characteristics: (1) Realistic and rich personal appearance (e.g., clothes and hair style) and variations: highly diverse clothes changes and styles, with reappearing gaps ranging from minutes to seasons, under different weather conditions (e.g., sunny, cloudy, windy, rainy, snowy, extremely cold) and events (e.g., working, leisure, daily activities). (2) Rich camera setups: raw videos were recorded by 17 outdoor cameras with varying resolutions, operating in a real-world surveillance system. (3) The currently largest scale: 17 cameras, 1,121 identities, and 178,407 bounding boxes, over the longest time span (12 months). Further, we investigate multimodal fusion strategies for tackling the clothes change challenge. Extensive experiments show that our fusion models outperform a wide variety of state-of-the-art models on DeepChange. Our dataset and documents are available at https://github.com/PengBoXiangShang/deepchange.
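The abstract mentions multimodal fusion strategies without detailing them. Below is a minimal illustrative sketch, not the paper's actual method, of one common fusion approach in re-id: score-level fusion, where per-modality distance matrices between query and gallery features are combined with a weighted average before ranking. The feature arrays and modality names here are hypothetical stand-ins.

```python
import numpy as np

def cosine_dist(q, g):
    # Pairwise cosine distance between query (m x d) and gallery (n x d) features.
    qn = q / np.linalg.norm(q, axis=1, keepdims=True)
    gn = g / np.linalg.norm(g, axis=1, keepdims=True)
    return 1.0 - qn @ gn.T

def fused_ranking(q_a, g_a, q_b, g_b, w=0.5):
    # Score-level fusion: weighted average of the two modalities' distance
    # matrices, then rank gallery items per query by ascending fused distance.
    d = w * cosine_dist(q_a, g_a) + (1.0 - w) * cosine_dist(q_b, g_b)
    return np.argsort(d, axis=1)

# Toy example with random features standing in for two modalities
# (e.g., appearance and body shape -- purely hypothetical here).
rng = np.random.default_rng(0)
q_a, g_a = rng.normal(size=(2, 8)), rng.normal(size=(5, 8))
q_b, g_b = rng.normal(size=(2, 8)), rng.normal(size=(5, 8))
ranks = fused_ranking(q_a, g_a, q_b, g_b)
print(ranks.shape)  # one ranked gallery list per query
```

The weight `w` trades off the two modalities; in practice it would be tuned on a validation split, and fusion could equally be done at the feature level by concatenating embeddings before computing distances.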
