Paper Title
Learning Gait Representation from Massive Unlabelled Walking Videos: A Benchmark
Paper Authors
Abstract
Gait depicts an individual's unique and distinguishing walking pattern and has become one of the most promising biometric features for human identification. As a fine-grained recognition task, gait recognition is easily affected by many factors and usually requires a large amount of fully annotated data, which is costly and often unattainable. This paper proposes a large-scale self-supervised benchmark for gait recognition with contrastive learning, aiming to learn general gait representations from massive unlabelled walking videos for practical applications by offering informative walking priors and diverse real-world variations. Specifically, we collect a large-scale unlabelled gait dataset, GaitLU-1M, consisting of 1.02M walking sequences, and propose a conceptually simple yet empirically powerful baseline model, GaitSSB. Experimentally, we evaluate the pre-trained model on four widely-used gait benchmarks, CASIA-B, OU-MVLP, GREW and Gait3D, with and without transfer learning. The unsupervised results are comparable to or even better than those of the early model-based and GEI-based methods. After transfer learning, our method outperforms existing methods by a large margin in most cases. Theoretically, we discuss the critical issues for a gait-specific contrastive framework and present some insights for further study. As far as we know, GaitLU-1M is the first large-scale unlabelled gait dataset, and GaitSSB is the first method that achieves remarkable unsupervised results on the aforementioned benchmarks. The source code of GaitSSB will be integrated into OpenGait, which is available at https://github.com/ShiqiYu/OpenGait.
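The abstract does not spell out GaitSSB's exact training objective, but contrastive self-supervised frameworks of this kind typically optimize an InfoNCE (NT-Xent) loss over two augmented views of each unlabelled sequence. The following NumPy sketch is an illustration of that standard objective, not the paper's implementation; the function name, batch layout, and temperature value are assumptions:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Standard InfoNCE (NT-Xent) loss, as used in SimCLR-style pre-training.

    z1, z2: (N, D) arrays of embeddings for two augmented views of the
    same N unlabelled sequences. Row i of z1 and row i of z2 form the
    positive pair; all other rows in the batch act as negatives.
    """
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    n = z1.shape[0]

    z = np.concatenate([z1, z2], axis=0)          # (2N, D)
    sim = z @ z.T / temperature                   # (2N, 2N) similarity logits
    np.fill_diagonal(sim, -np.inf)                # exclude self-similarity

    # For row i in [0, N) the positive is row i + N, and vice versa.
    targets = np.concatenate([np.arange(n) + n, np.arange(n)])

    # Cross-entropy via a numerically stable log-softmax over each row.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), targets].mean()
```

Intuitively, the loss is low when the two views of the same walking sequence land close together in embedding space while different sequences stay apart, which is what lets the encoder learn identity-relevant gait structure without any labels.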