Paper Title
On information gain, Kullback-Leibler divergence, entropy production and the involution kernel
Paper Authors
Paper Abstract
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, which extends the concept of Shannon entropy, plays a fundamental role. Given an {\it a priori} probability kernel $\hat{\nu}$ and a probability $\pi$ on the measurable space $X \times Y$, we consider an appropriate definition of the entropy of $\pi$ relative to $\hat{\nu}$, building on previous works. Using this concept of entropy, we obtain a natural definition of information gain for general measurable spaces, which coincides with the mutual information obtained from the K-L divergence in the case where $\hat{\nu}$ is identified with a probability $\nu$ on $X$. This is used to extend the meaning of specific information gain and of dynamical entropy production to the model of thermodynamic formalism for symbolic dynamics over a compact alphabet (the TFCA model). In this setting, we show that the involution kernel is a natural tool for better understanding some important properties of entropy production.
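For orientation, the classical quantities the abstract invokes can be written in their standard forms; these are the usual textbook definitions, not the paper's kernel-based generalization, and the marginal notation $\pi_X$, $\pi_Y$ is ours. Assuming $\pi \ll \nu$,
\[
D_{\mathrm{KL}}(\pi \,\|\, \nu) \;=\; \int \log \frac{d\pi}{d\nu}\, d\pi,
\]
and for a joint probability $\pi$ on $X \times Y$ with marginals $\pi_X$ and $\pi_Y$, the mutual information is recovered as a K-L divergence against the product of the marginals:
\[
I(\pi) \;=\; D_{\mathrm{KL}}\bigl(\pi \,\big\|\, \pi_X \otimes \pi_Y\bigr).
\]
The abstract's claim is that its kernel-relative entropy reduces to this second expression when the kernel $\hat{\nu}$ is identified with a fixed probability $\nu$ on $X$.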