Paper Title

Being Bayesian about Categorical Probability

Paper Authors

Taejong Joo, Uijung Chung, Min-Gwan Seo

Paper Abstract

Neural networks utilize the softmax as a building block in classification tasks, but it suffers from overconfidence and lacks the ability to represent uncertainty. As a Bayesian alternative to the softmax, we consider the categorical probability over class labels as a random variable. In this framework, the prior distribution explicitly models the presumed noise inherent in the observed labels, which provides consistent gains in generalization performance across multiple challenging tasks. The proposed method inherits the advantages of Bayesian approaches, achieving better uncertainty estimation and model calibration. Our method can be implemented as a plug-and-play loss function with negligible computational overhead compared to the softmax with the cross-entropy loss.
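The plug-and-play objective described above amounts to a variational bound: the network output parameterizes a Dirichlet distribution over the categorical probability, and training balances the expected log-likelihood of the observed label against the KL divergence to a Dirichlet prior that encodes the presumed label noise. Below is a minimal PyTorch sketch of such a loss under stated assumptions: the exponential link `alpha = exp(logits)`, a symmetric prior concentration `prior_beta`, and a KL weight `kl_weight` are illustrative choices, and `belief_matching_loss` is a hypothetical name, not the authors' reference implementation.

```python
import torch

def belief_matching_loss(logits, labels, prior_beta=1.0, kl_weight=1.0):
    """Negative ELBO with a Dirichlet posterior over class probabilities.

    Sketch of the abstract's idea: q = Dirichlet(alpha) replaces the softmax
    point estimate, and the prior Dirichlet(prior_beta) models label noise.
    The exp link and the symmetric prior are illustrative assumptions.
    """
    alphas = torch.exp(logits)                      # concentrations, shape (B, K)
    alpha0 = alphas.sum(dim=-1, keepdim=True)       # total concentration, (B, 1)

    # Dirichlet identity: E_q[log pi_y] = digamma(alpha_y) - digamma(alpha_0)
    alpha_y = alphas.gather(-1, labels.unsqueeze(-1))
    expected_ll = (torch.digamma(alpha_y) - torch.digamma(alpha0)).squeeze(-1)

    # Closed-form KL(Dirichlet(alpha) || Dirichlet(beta)), symmetric beta
    betas = torch.full_like(alphas, prior_beta)
    beta0 = betas.sum(dim=-1)
    kl = (torch.lgamma(alpha0.squeeze(-1)) - torch.lgamma(alphas).sum(dim=-1)
          - torch.lgamma(beta0) + torch.lgamma(betas).sum(dim=-1)
          + ((alphas - betas)
             * (torch.digamma(alphas) - torch.digamma(alpha0))).sum(dim=-1))

    # Minimize the negative ELBO; kl_weight trades likelihood against the prior
    return (kl_weight * kl - expected_ll).mean()

# Usage: drop-in replacement for F.cross_entropy(logits, labels)
logits = torch.randn(4, 10, requires_grad=True)
labels = torch.randint(0, 10, (4,))
belief_matching_loss(logits, labels).backward()
```

At test time the predictive mean `alphas / alpha0` plays the role of the softmax output, while the spread of the Dirichlet provides the uncertainty estimate the abstract refers to; the only extra cost over cross-entropy is a handful of `lgamma`/`digamma` evaluations, consistent with the claimed negligible overhead.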
