Paper Title
Model Reduction for the Chemical Master Equation: an Information-Theoretic Approach
Paper Authors
Paper Abstract
The complexity of mathematical models in biology has rendered model reduction an essential tool in the quantitative biologist's toolkit. For stochastic reaction networks described using the Chemical Master Equation, commonly used methods include time-scale separation, the Linear Mapping Approximation and state-space lumping. Despite the success of these techniques, they appear to be rather disparate, and at present no general-purpose approach to model reduction for stochastic reaction networks is known. In this paper we show that the most common model reduction approaches for the Chemical Master Equation can be seen as minimising a well-known information-theoretic quantity between the full model and its reduction: the Kullback-Leibler divergence defined on the space of trajectories. This allows us to recast the task of model reduction as a variational problem that can be tackled using standard numerical optimisation approaches. In addition, we derive general expressions for the propensities of a reduced system that generalise those found using classical methods. Using three examples from the literature, namely an autoregulatory feedback loop, the Michaelis-Menten enzyme system and a genetic oscillator, we show that the Kullback-Leibler divergence is a useful metric for assessing model discrepancy and for comparing different model reduction techniques.
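For orientation, a minimal sketch of the trajectory-space Kullback-Leibler divergence referred to in the abstract, assuming both the full and the reduced model are Markov jump processes on a common state space with propensities a_r(x) and ã_r(x) respectively (the symbols a_r, ã_r, T and r_k are illustrative notation, not taken from the paper): a Girsanov-type change of measure for jump processes gives

\[
\mathcal{D}_{\mathrm{KL}}\left( P \,\middle\|\, \tilde{P} \right)
= \mathbb{E}_{P}\left[ \sum_{k} \log \frac{a_{r_k}\left(x_{t_k^-}\right)}{\tilde{a}_{r_k}\left(x_{t_k^-}\right)}
- \int_{0}^{T} \sum_{r} \left( a_r(x_t) - \tilde{a}_r(x_t) \right) \mathrm{d}t \right],
\]

where P and \tilde{P} denote the path measures of the full and reduced models on the interval [0, T], the sum runs over the reaction events of a trajectory, with reaction r_k firing at time t_k from state x_{t_k^-}, and the expectation is taken over trajectories of the full model. Minimising this quantity over a parametrised family of reduced propensities \tilde{a}_r is what turns model reduction into a variational problem amenable to standard numerical optimisation.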