Title

End-to-End Context-Aided Unicity Matching for Person Re-identification

Authors

Min Cao, Cong Ding, Chen Chen, Junchi Yan, Qi Tian

Abstract


Most existing person re-identification methods compute the matching relations between person images across camera views based on the ranking of pairwise similarities. This matching strategy, lacking a global viewpoint and consideration of context, inevitably leads to ambiguous matching results and sub-optimal performance. Based on the natural assumption that images belonging to the same person identity should not match images belonging to multiple different person identities across views, called the unicity of person matching at the identity level, we propose an end-to-end person unicity matching architecture for learning and refining the person matching relations. First, we adopt the image samples' contextual information in feature space to generate initial soft matching results using graph neural networks. Second, we utilize the samples' global context relationship to refine the soft matching results and reach matching unicity through bipartite graph matching. With full consideration of real-world person re-identification applications, we achieve unicity matching in both one-shot and multi-shot settings of person re-identification and further develop a fast version of the unicity matching without loss of performance. The proposed method is evaluated on five public benchmarks, including four multi-shot datasets, MSMT17, DukeMTMC, Market1501, and CUHK03, and one one-shot dataset, VIPeR. Experimental results show the superiority of the proposed method in both performance and efficiency.
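As a toy illustration of the unicity constraint (not the paper's actual pipeline, which refines GNN-produced soft matching scores via bipartite graph matching), the sketch below contrasts conventional rank-based matching, where each query independently takes its most similar gallery identity, with a one-to-one assignment that enforces unicity. The similarity matrix and the brute-force assignment solver are hypothetical stand-ins; a real system would use an efficient algorithm such as the Hungarian method.

```python
from itertools import permutations

def greedy_match(sim):
    # Rank-based matching: each query independently picks its top-1
    # gallery identity, so two queries may collide on the same identity.
    return [max(range(len(row)), key=row.__getitem__) for row in sim]

def unicity_match(sim):
    # One-to-one bipartite matching that maximizes total similarity.
    # Brute force over permutations is fine for this toy size; real
    # systems would use the Hungarian algorithm (O(n^3)).
    n = len(sim)
    best = max(permutations(range(n)),
               key=lambda p: sum(sim[i][p[i]] for i in range(n)))
    return list(best)

# Hypothetical query-gallery similarity matrix (3 queries x 3 identities).
sim = [
    [0.90, 0.80, 0.10],
    [0.85, 0.30, 0.20],
    [0.10, 0.20, 0.70],
]

print(greedy_match(sim))   # queries 0 and 1 both claim gallery identity 0
print(unicity_match(sim))  # unicity resolves the conflict one-to-one
```

Here greedy matching assigns both query 0 and query 1 to gallery identity 0, an ambiguity the unicity constraint forbids; the assignment version instead gives each query a distinct identity while maximizing the summed similarity.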
