Paper Title

Cost-aware Bayesian Optimization

Paper Authors

Eric Hans Lee, Valerio Perrone, Cedric Archambeau, Matthias Seeger

Paper Abstract

Bayesian optimization (BO) is a class of global optimization algorithms, suitable for minimizing an expensive objective function in as few function evaluations as possible. While BO budgets are typically given in iterations, this implicitly measures convergence in terms of iteration count and assumes each evaluation has identical cost. In practice, evaluation costs may vary in different regions of the search space. For example, the cost of neural network training increases quadratically with layer size, which is a typical hyperparameter. Cost-aware BO measures convergence with alternative cost metrics such as time, energy, or money, for which vanilla BO methods are unsuited. We introduce Cost Apportioned BO (CArBO), which attempts to minimize an objective function in as little cost as possible. CArBO combines a cost-effective initial design with a cost-cooled optimization phase which depreciates a learned cost model as iterations proceed. On a set of 20 black-box function optimization problems we show that, given the same cost budget, CArBO finds significantly better hyperparameter configurations than competing methods.
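
The abstract describes CArBO's cost-cooled phase as depreciating a learned cost model as iterations proceed. A common way to express this is an expected-improvement acquisition divided by predicted cost raised to a cooling exponent that anneals from 1 (cheap evaluations favored early) to 0 (plain EI late). The sketch below illustrates that idea in Python; it is an assumption based on the abstract's description, not the authors' implementation, and all names (cost_cooled_ei, the specific annealing schedule, the surrogate inputs mu/sigma and the cost model output cost) are illustrative.

```python
import numpy as np
from scipy.stats import norm

def cost_cooled_ei(mu, sigma, best_f, cost,
                   budget_used, budget_init, budget_total):
    """Cost-cooled expected improvement: EI(x) / c(x)**alpha.

    mu, sigma : posterior mean/std of the objective surrogate at x
    best_f    : best objective value observed so far (minimization)
    cost      : predicted evaluation cost c(x) from a learned cost model
    alpha anneals from 1 to 0 as the cost budget is consumed, so the
    cost model is gradually depreciated, as the abstract describes.
    """
    # Standard expected improvement for minimization.
    imp = best_f - mu
    z = imp / np.maximum(sigma, 1e-12)
    ei = np.maximum(imp * norm.cdf(z) + sigma * norm.pdf(z), 0.0)

    # Cooling exponent: fraction of the post-initial-design budget left
    # (one plausible schedule; the paper's exact schedule may differ).
    alpha = (budget_total - budget_used) / max(budget_total - budget_init, 1e-12)
    alpha = float(np.clip(alpha, 0.0, 1.0))

    # Early on (alpha near 1), cheap points are strongly preferred;
    # near budget exhaustion (alpha near 0), this reduces to plain EI.
    return ei / np.maximum(cost, 1e-12) ** alpha
```

Under this schedule, the acquisition behaves like cost-normalized EI at the start of the optimization phase and smoothly transitions to standard EI as the budget runs out, which matches the abstract's claim that the cost model is "depreciated" over time.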
