Title

Parameter Optimization using high-dimensional Bayesian Optimization

Author

Yenicelik, David

Abstract

In this thesis, I explore the possibilities of conducting Bayesian optimization in high-dimensional domains. Although "high-dimensional" can be defined as hundreds to thousands of dimensions, we primarily focus on problem settings between two and 20 dimensions. As such, we focus on solutions to practical problems, such as tuning the parameters of an electron accelerator, or on even simpler tasks that can be run and optimized in reasonable time on a standard laptop. Our main contributions are 1.) comparing how the log-likelihood relates to the angle difference between the true projection matrix and the recovered matrix, 2.) an extensive analysis of currently popular methods, including their strengths and shortcomings, 3.) a short analysis of how dimensionality reduction techniques can be used for feature selection, and 4.) a novel algorithm called "BORING", which allows for a simple fallback mechanism if the matrix identification fails, and which also takes into consideration "passive" subspaces that contribute small perturbations to the function at hand. The main features of BORING are 1.) the possibility to identify the subspace (unlike most other optimization algorithms), and 2.) incurring only a small penalty when subspace identification fails, as optimization remains the primary goal.
