Paper title
Towards Explaining Distribution Shifts
Paper authors
Paper abstract
A distribution shift can have fundamental consequences, such as signaling a change in the operating environment or significantly degrading the accuracy of downstream models. Thus, understanding distribution shifts is critical for examining and, hopefully, mitigating their effects. Most prior work focuses on merely detecting whether a shift has occurred and assumes that any detected shift can be understood and handled appropriately by a human operator. We aim to aid these manual mitigation tasks by explaining the distribution shift using an interpretable transportation map from the original distribution to the shifted one. We derive our interpretable mappings from a relaxation of optimal transport in which the candidate mappings are restricted to an interpretable set. We then examine multiple quintessential use cases of distribution shift in real-world tabular, text, and image datasets to show how our explanatory mappings strike a better balance between detail and interpretability than baseline explanations, as judged by both visual inspection and our PercentExplained metric.
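To make the abstract's core ideas concrete, the sketch below illustrates (under our own simplifying assumptions, not the paper's exact formulation) how a transport map between two empirical distributions can be compared against a restricted, interpretable candidate map. For equal-size samples, optimal transport under squared Euclidean cost reduces to an assignment problem; here the interpretable candidate is a single mean translation, and `percent_explained` is a simplified ratio inspired by, but not identical to, the paper's PercentExplained metric.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)

# Source samples X and a shifted target Y: same points translated by a
# constant vector plus small noise (a toy "mean shift" distribution shift).
X = rng.normal(size=(50, 2))
true_shift = np.array([2.0, -1.0])
Y = X + true_shift + rng.normal(scale=0.1, size=(50, 2))

def ot_cost(A, B):
    """Empirical OT cost between equal-size samples under squared
    Euclidean ground cost, solved exactly as an assignment problem."""
    cost = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].sum()

# Unrestricted OT cost from X to Y (the "full detail" transport).
full_cost = ot_cost(X, Y)

# An interpretable candidate map: translate every point by the mean
# difference between the two samples.
interpretable_map = X + (Y.mean(axis=0) - X.mean(axis=0))

# Residual cost after applying the interpretable map; the closer this is
# to zero, the more of the shift the simple map accounts for.
residual = ot_cost(interpretable_map, Y)

# Simplified percent-explained ratio (an assumption for illustration).
pe = 1.0 - residual / full_cost
print(f"percent explained by mean-shift map: {pe:.3f}")
```

Because the synthetic shift really is a constant translation, the mean-shift map explains almost all of the transport cost; for a shift that also rescales or rotates features, a richer (but still restricted) map family would be needed to keep `pe` high.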