Paper Title

Learning Instance-Specific Augmentations by Capturing Local Invariances

Authors

Ning Miao, Tom Rainforth, Emile Mathieu, Yann Dubois, Yee Whye Teh, Adam Foster, Hyunjik Kim

Abstract

We introduce InstaAug, a method for automatically learning input-specific augmentations from data. Previous methods for learning augmentations have typically assumed independence between the original input and the transformation applied to that input. This can be highly restrictive, as the invariances we hope our augmentation will capture are themselves often highly input dependent. InstaAug instead introduces a learnable invariance module that maps from inputs to tailored transformation parameters, allowing local invariances to be captured. This can be simultaneously trained alongside the downstream model in a fully end-to-end manner, or separately learned for a pre-trained model. We empirically demonstrate that InstaAug learns meaningful input-dependent augmentations for a wide range of transformation classes, which in turn provides better performance on both supervised and self-supervised tasks.
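The abstract describes a learnable invariance module that maps each input to the parameters of a transformation distribution, so that augmentation strength is tailored per instance rather than shared globally. The following is a minimal illustrative sketch of that idea, not the authors' implementation: all names (`invariance_module`, `sample_augmentation`, the weight vector `W`) are hypothetical, the "module" is a single linear map with a sigmoid instead of a neural network, and the transformation class is restricted to 2-D rotations for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def invariance_module(x, W):
    """Map an input to an input-specific transformation parameter.

    Here: a linear map plus sigmoid yields the half-width of a uniform
    rotation range in [0, pi]. In InstaAug this role is played by a
    learned neural network producing richer transformation parameters.
    """
    logit = float(x @ W)
    return np.pi / (1.0 + np.exp(-logit))

def sample_augmentation(x, W):
    """Sample a rotation from the input-conditioned distribution and apply it."""
    half_width = invariance_module(x, W)
    theta = rng.uniform(-half_width, half_width)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ x[:2], theta, half_width  # rotate a 2-D slice of the input

# Toy usage: different inputs yield different augmentation ranges,
# which is the "instance-specific" property the paper targets.
x = np.array([1.0, 0.0])
W = np.array([2.0, 0.0])
x_aug, theta, half_width = sample_augmentation(x, W)
```

In the full method, the parameters of the invariance module are trained jointly with the downstream model (or fitted post hoc to a pre-trained model), so the per-input transformation ranges are learned from data rather than hand-set as above.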
