Paper Title
A Flow-Guided Mutual Attention Network for Video-Based Person Re-Identification
Paper Authors
Paper Abstract
Person Re-Identification (ReID) is a challenging problem in many video analytics and surveillance applications, where a person's identity must be associated across a distributed network of non-overlapping cameras. Video-based person ReID has recently gained much interest because it allows capturing discriminant spatio-temporal information from video clips that is unavailable to image-based ReID. Despite recent advances, deep learning (DL) models for video ReID often fail to leverage this information to improve the robustness of feature representations. In this paper, the motion pattern of a person is explored as an additional cue for ReID. In particular, a flow-guided Mutual Attention network is proposed to fuse image and optical flow sequences using any 2D-CNN backbone, allowing temporal information to be encoded along with spatial appearance information. Our Mutual Attention network relies on joint spatial attention between image and optical flow feature maps to activate a common set of salient features across both streams. In addition to flow-guided attention, we introduce a method to aggregate features from longer input streams for a better video sequence-level representation. Extensive experiments on three challenging video ReID datasets indicate that the proposed Mutual Attention network considerably improves recognition accuracy over conventional gated-attention networks and state-of-the-art methods for video-based person ReID.
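
As a rough illustration of the joint spatial attention described in the abstract, the following PyTorch sketch computes a single attention mask from concatenated image and optical-flow feature maps and uses it to reweight both streams. The module and tensor names are hypothetical, and the 1x1-convolution attention head is an assumption; the paper may realize mutual attention differently.

    # Minimal sketch of mutual (joint) spatial attention between two streams.
    # Assumption: both streams come from 2D-CNN backbones with matching shapes.
    import torch
    import torch.nn as nn

    class MutualAttention(nn.Module):
        def __init__(self, channels: int):
            super().__init__()
            # The joint mask is predicted from the concatenated streams,
            # so it depends on image appearance AND motion simultaneously.
            self.joint = nn.Sequential(
                nn.Conv2d(2 * channels, channels, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, 1, kernel_size=1),
                nn.Sigmoid(),  # spatial attention mask in [0, 1]
            )

        def forward(self, img_feat: torch.Tensor, flow_feat: torch.Tensor):
            # img_feat, flow_feat: (B, C, H, W) feature maps.
            attn = self.joint(torch.cat([img_feat, flow_feat], dim=1))  # (B, 1, H, W)
            # The same mask activates salient locations in both streams,
            # i.e. a common set of salient features across image and flow.
            return img_feat * attn, flow_feat * attn

    # Usage: reweight mid-level features of both streams, then fuse.
    img_f = torch.randn(4, 256, 16, 8)
    flow_f = torch.randn(4, 256, 16, 8)
    img_a, flow_a = MutualAttention(256)(img_f, flow_f)
    fused = img_a + flow_a  # simple additive fusion, for illustration only

Sharing one mask across streams is what distinguishes this from conventional gated attention, where each stream is typically gated independently.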
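
The abstract also mentions aggregating features from longer input streams into a sequence-level representation, without specifying the scheme. Below is one plausible hedged sketch: clip-level descriptors are combined with a norm-based weighted average. The function name and weighting rule are assumptions for illustration, not the paper's method.

    # Hedged sketch: aggregate clip-level descriptors into one video descriptor.
    import torch

    def aggregate_clips(clip_feats: torch.Tensor) -> torch.Tensor:
        # clip_feats: (num_clips, D) descriptors extracted along the sequence.
        # Assumed weighting: softmax over descriptor norms as a salience proxy.
        weights = torch.softmax(clip_feats.norm(dim=1), dim=0)  # (num_clips,)
        return (weights.unsqueeze(1) * clip_feats).sum(dim=0)   # (D,)

    video_desc = aggregate_clips(torch.randn(8, 2048))  # e.g. 8 clips, D=2048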