Learning Occlusion-Robust Pedestrian Representations via Uncertainty-Guided Feature Pruning

Authors

  • Lukas M. Schneider, Department of Informatics, Technical University of Munich (TUM), 85748 Garching, Germany
  • Anna K. Vogel, Department of Informatics, Technical University of Munich (TUM), 85748 Garching, Germany
  • Tobias R. Weber, Department of Informatics, Technical University of Munich (TUM), 85748 Garching, Germany

DOI:

https://doi.org/10.71465/fapm658

Keywords:

Pedestrian re-identification, occlusion handling, uncertainty-aware attention, autonomous driving, visual perception

Abstract

Occlusion and background interference remain major challenges for pedestrian re-identification in urban traffic environments. Inspired by uncertainty-aware CLIP-based frameworks, this paper introduces an uncertainty-guided feature selection mechanism that adjusts the contribution of local visual regions and semantic cues according to their estimated reliability. The proposed method is evaluated on two autonomous driving datasets with both real-world and synthetic occlusion patterns, covering occlusion ratios from 20% to 60%. Comparisons are conducted against attention-based and part-based ReID methods, including PCB, OSNet, and transformer-based attention models. The proposed approach achieves mAP improvements ranging from 4.5% to 6.2% under severe occlusion conditions, while maintaining comparable performance in fully visible settings.
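The abstract describes weighting local regions by their estimated reliability but does not spell out the formulation. As a rough, hypothetical sketch of uncertainty-guided feature fusion (the function name, per-part log-variance input, and the softmax-over-negative-log-variance weighting are illustrative assumptions, not the authors' exact mechanism):

```python
import numpy as np

def uncertainty_weighted_fusion(part_features, log_vars):
    """Fuse per-part embeddings, down-weighting unreliable (e.g. occluded) parts.

    part_features: (P, D) array of local region embeddings.
    log_vars: (P,) predicted log-variances; higher means less reliable.
    Weights are a softmax over negative log-variance, so low-uncertainty
    parts dominate the fused descriptor.
    """
    weights = np.exp(-log_vars)
    weights = weights / weights.sum()
    fused = (weights[:, None] * part_features).sum(axis=0)
    return fused, weights

# Toy example: the third part is "occluded" (high predicted uncertainty),
# so its contribution to the fused descriptor is strongly suppressed.
feats = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [5.0, 5.0]])
log_vars = np.array([0.0, 0.0, 4.0])
fused, w = uncertainty_weighted_fusion(feats, log_vars)
```

In a trained model the log-variances would come from a small head on each local feature; here they are fixed constants purely to show the weighting behavior.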

Downloads

Download data is not yet available.

Published

2026-02-24