

Oral

Improving Model Training with Multi-fidelity Hyperparameter Evaluation

Yimin Huang · Yujun Li · Hanrong Ye · Zhenguo Li · Zhihua Zhang

Exhibit Hall A

Abstract:

Evaluating hyperparameters, neural architectures, or data augmentation policies is a critical problem in advanced deep model training with a large hyperparameter search space. In this paper, we propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) for hyperparameter search evaluation, together with a modified version designed for high GPU utilization. SS evaluates the potential of hyperparameter configurations from sub-samples of their observations and is theoretically proven to be optimal under the criterion of cumulative regret. We further combine SS with Bayesian Optimization to develop a novel hyperparameter optimization algorithm called BOSS. Empirical studies validate our theoretical arguments for SS and demonstrate the superior performance of BOSS on a number of applications, including Neural Architecture Search (NAS), Data Augmentation (DA), Object Detection (OD), and Reinforcement Learning (RL).
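
To make the general idea concrete, below is a minimal, hypothetical Python sketch of the pattern the abstract describes: a proposer suggests candidate configurations (standing in for the Bayesian Optimization component), and a bandit-style loop allocates a few observations per round to each surviving candidate, ranks them on sub-samples of their observations, and discards the weaker half. All names (noisy_score, propose_configs, subsample_evaluate) and the specific elimination rule are illustrative assumptions; they do not reproduce the exact SS or BOSS procedures from the paper.

import random
import statistics

def noisy_score(config):
    # Hypothetical toy objective: a noisy validation score for one more
    # "resource unit" (e.g., an epoch). In practice this would train a model.
    lr, wd = config["lr"], config["wd"]
    true = -(lr - 0.1) ** 2 - (wd - 0.01) ** 2
    return true + random.gauss(0, 0.01)

def propose_configs(n):
    # Stand-in for the Bayesian Optimization proposer: here we sample
    # uniformly; BOSS would use a surrogate model and an acquisition function.
    return [{"lr": random.uniform(1e-3, 0.3), "wd": random.uniform(1e-4, 0.05)}
            for _ in range(n)]

def subsample_evaluate(configs, budget_per_round=3, rounds=4):
    # Bandit-style evaluation: each round, every surviving arm receives a few
    # new observations; arms are compared on random sub-samples of their
    # observations and the weaker half is dropped. This mimics the spirit of
    # sub-sampling evaluation, not the exact SS rule from the paper.
    arms = {i: [] for i in range(len(configs))}
    survivors = list(arms)
    for _ in range(rounds):
        for i in survivors:
            arms[i].extend(noisy_score(configs[i]) for _ in range(budget_per_round))

        def subsample_mean(obs):
            k = max(1, len(obs) // 2)
            return statistics.fmean(random.sample(obs, k))

        survivors.sort(key=lambda i: subsample_mean(arms[i]), reverse=True)
        survivors = survivors[: max(1, len(survivors) // 2)]
    best = survivors[0]
    return configs[best], statistics.fmean(arms[best])

if __name__ == "__main__":
    candidates = propose_configs(8)
    best_config, best_score = subsample_evaluate(candidates)
    print("best config:", best_config, "estimated score:", round(best_score, 4))

In this sketch, the sub-sample ranking is what limits how much each observation can sway the comparison between arms; the paper's contribution is an elimination rule of this kind with a cumulative-regret optimality guarantee, plus its integration with a Bayesian Optimization proposer.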
