Chen Qiaoyuan, Chen Ying. Correlation Channel-Wise Based Part Aligned Representations for Person Re-identification[J]. Journal of Computer-Aided Design & Computer Graphics, 2020, 32(8): 1258-1266. DOI: 10.3724/SP.J.1089.2020.18043

Correlation Channel-Wise Based Part Aligned Representations for Person Re-identification

Abstract: Due to viewpoint and pose variation, occlusion, and errors in automatically detected bounding boxes, images of the same person can differ greatly, which poses a major challenge for person re-identification. To improve re-identification performance under pose variation, a correlation channel-wise based part aligned representations (CCPAR) network for person re-identification is proposed. First, a person image is fed into two sub-networks that extract appearance features and part features, respectively. Second, a correlation channel-wise module (CCM) is designed to optimize the channel weights of the part features by mining the correlations between their channels. Finally, the optimized part features and the appearance features are fused by bilinear pooling. Experiments on three large-scale public datasets show that the CCM effectively enhances the part features, and that the proposed part aligned network is robust to pose variation and background interference. CCPAR reaches Rank-1/mAP of 93.9%/90.6%, 87.6%/83.3%, and 70.4%/72.8% on Market-1501, DukeMTMC-reID, and CUHK03, respectively, outperforming existing methods.
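The abstract describes two technical steps: re-weighting the channels of the part features according to their inter-channel correlations, then fusing the result with the appearance features by bilinear pooling. The paper's exact CCM formulation is not given in the abstract, so the NumPy sketch below only illustrates the general idea; the cosine-correlation measure, the sigmoid weighting, and all tensor shapes are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def correlation_channel_weights(part, eps=1e-8):
    """Re-weight part-feature channels by their mutual correlation.

    part: (C, H, W) part feature map. Returns a feature map of the
    same shape. Illustrative sketch only: the correlation measure and
    the sigmoid mapping are assumptions, not the paper's exact CCM.
    """
    C = part.shape[0]
    X = part.reshape(C, -1)                            # (C, H*W)
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + eps)
    corr = Xn @ Xn.T                                   # (C, C) channel correlations
    # A channel's weight grows with its average correlation to the
    # other channels; a sigmoid keeps each weight in (0, 1).
    w = 1.0 / (1.0 + np.exp(-corr.mean(axis=1)))       # (C,)
    return part * w[:, None, None]

def bilinear_pool(appearance, part):
    """Fuse two feature maps by bilinear pooling: the spatially
    averaged outer product of appearance and part features.

    appearance: (Ca, H, W), part: (Cp, H, W) -> (Ca * Cp,) vector.
    """
    A = appearance.reshape(appearance.shape[0], -1)    # (Ca, H*W)
    P = part.reshape(part.shape[0], -1)                # (Cp, H*W)
    fused = (A @ P.T) / A.shape[1]                     # (Ca, Cp)
    return fused.reshape(-1)

rng = np.random.default_rng(0)
app = rng.standard_normal((8, 4, 4))                   # toy appearance features
prt = rng.standard_normal((6, 4, 4))                   # toy part features
feat = bilinear_pool(app, correlation_channel_weights(prt))
print(feat.shape)  # (48,)
```

Note that the fused descriptor's dimensionality is the product of the two channel counts, which is why bilinear pooling is typically applied to feature maps with modest channel counts or followed by dimensionality reduction.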
