Paper Title
Cooperative Distribution Alignment via JSD Upper Bound
Paper Authors
Paper Abstract
Unsupervised distribution alignment estimates a transformation that maps two or more source distributions to a shared aligned distribution given only samples from each distribution. This task has many applications including generative modeling, unsupervised domain adaptation, and socially aware learning. Most prior works use adversarial learning (i.e., min-max optimization), which can be challenging to optimize and evaluate. A few recent works explore non-adversarial flow-based (i.e., invertible) approaches, but they lack a unified perspective and are limited in efficiently aligning multiple distributions. Therefore, we propose to unify and generalize previous flow-based approaches under a single non-adversarial framework, which we prove is equivalent to minimizing an upper bound on the Jensen-Shannon Divergence (JSD). Importantly, our problem reduces to a min-min, i.e., cooperative, problem and can provide a natural evaluation metric for unsupervised distribution alignment. We show empirical results on both simulated and real-world datasets to demonstrate the benefits of our approach. Code is available at https://github.com/inouye-lab/alignment-upper-bound.
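For background on the divergence the abstract refers to, the Jensen-Shannon Divergence symmetrizes the KL divergence through a mixture distribution and is always bounded above by log 2. The sketch below is purely illustrative of the JSD definition for discrete distributions; it is not the paper's flow-based upper-bound method, and the function names are our own.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon Divergence: average of KL(p || m) and KL(q || m),
    # where m is the equal-weight mixture of p and q
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(jsd(p, q))  # strictly positive since p != q, and at most log(2)
print(jsd(p, p))  # identical distributions give 0.0
```

When two distributions are perfectly aligned the JSD is zero, which is why minimizing it (or an upper bound on it, as the paper proposes) drives the transformed source distributions toward a shared aligned distribution.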