Paper Title
Symmetrical Synthesis for Deep Metric Learning
Paper Authors
Paper Abstract
Deep metric learning aims to learn embeddings that contain semantic similarity information among data points. To learn better embeddings, methods that generate synthetic hard samples have been proposed. Existing methods of synthetic hard sample generation adopt autoencoders or generative adversarial networks, which leads to more hyper-parameters, harder optimization, and slower training. In this paper, we address these problems by proposing a novel method of synthetic hard sample generation called symmetrical synthesis. Given two original feature points from the same class, the proposed method first generates synthetic points, using each original point as an axis of symmetry for the other. Second, it performs hard negative pair mining among the original and synthetic points to select a more informative negative pair for computing the metric learning loss. Our proposed method is hyper-parameter free and plug-and-play for existing metric learning losses, requiring no network modification. We demonstrate the superiority of our proposed method over existing methods for a variety of loss functions on clustering and image retrieval tasks. Our implementation is publicly available.
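The two steps described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the reflection formula (a Householder-style reflection of one point about the line spanned by the other) is one plausible reading of "synthetic points with each other as an axis of symmetry", and the function names are the sketch's own.

```python
import numpy as np

def reflect(x, axis):
    # Reflect x about the line through the origin spanned by `axis`
    # (assumed formula; the paper's exact construction may differ).
    u = axis / np.linalg.norm(axis)
    return 2.0 * np.dot(x, u) * u - x

def symmetrical_synthesis(x1, x2):
    # Step 1: given two original points of the same class, each point
    # is reflected using the other as the axis of symmetry, producing
    # two synthetic points.
    return reflect(x1, x2), reflect(x2, x1)

def hardest_negative_pair(pos_points, neg_points):
    # Step 2: among all (positive, negative) combinations of original
    # and synthetic points, select the closest pair -- the most
    # informative (hardest) negative pair for the metric learning loss.
    best = None
    for p in pos_points:
        for n in neg_points:
            d = np.linalg.norm(p - n)
            if best is None or d < best[0]:
                best = (d, p, n)
    return best
```

Because the reflection is length-preserving, the synthetic points lie at the same distance from the origin as the originals; the loss itself is left untouched, which is why the method plugs into existing metric learning losses without extra hyper-parameters.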