Paper Title
PyHopper -- Hyperparameter optimization
Paper Authors
Paper Abstract
Hyperparameter tuning is a fundamental aspect of machine learning research. Setting up the infrastructure for systematic optimization of hyperparameters can take a significant amount of time. Here, we present PyHopper, a black-box optimization platform designed to streamline the hyperparameter tuning workflow of machine learning researchers. PyHopper's goal is to integrate with existing code with minimal effort and run the optimization process with minimal necessary manual oversight. With simplicity as the primary theme, PyHopper is powered by a single robust Markov-chain Monte-Carlo optimization algorithm that scales to millions of dimensions. Compared to existing tuning packages, focusing on a single algorithm frees the user from having to decide between several algorithms and makes PyHopper easily customizable. PyHopper is publicly available under the Apache-2.0 license at https://github.com/PyHopper/PyHopper.
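The single-algorithm design described in the abstract, a Markov-chain Monte-Carlo search, can be illustrated with a minimal self-contained sketch. This is a toy illustration of the general idea (Gaussian perturbations around the best candidate found so far), not PyHopper's actual implementation; all names and parameters here are hypothetical:

```python
import random

def mcmc_tune(objective, low, high, steps=200, seed=0):
    """Toy MCMC-style search over a single float hyperparameter.

    Starts from a random point and repeatedly proposes Gaussian
    perturbations of the best candidate seen so far, keeping a
    proposal only if it improves the objective (a greedy,
    hill-climbing variant of MCMC sampling).
    """
    rng = random.Random(seed)
    best = rng.uniform(low, high)
    best_score = objective(best)
    sigma = (high - low) / 10  # proposal step size
    for _ in range(steps):
        # Propose a perturbation, clipped to the search interval.
        candidate = min(high, max(low, rng.gauss(best, sigma)))
        score = objective(candidate)
        if score > best_score:  # maximize the objective
            best, best_score = candidate, score
    return best, best_score

# Example: maximize a concave function whose optimum is at x = 3.
best_x, best_val = mcmc_tune(lambda x: -(x - 3.0) ** 2, 0.0, 10.0)
```

In a real tuning workflow the lambda would be replaced by a training-and-validation run that returns a validation score, which is the integration point the abstract describes.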