MUMBO: MUlti-task Max-Value Bayesian Optimization
Authors
Abstract
We propose MUMBO, the first high-performing yet computationally efficient acquisition function for multi-task Bayesian optimization. Here, the challenge is to perform efficient optimization by evaluating low-cost functions somehow related to our true target function. This is a broad class of problems including the popular task of multi-fidelity optimization. However, while information-theoretic acquisition functions are known to provide state-of-the-art Bayesian optimization, existing implementations for multi-task scenarios have prohibitive computational requirements. Previous approaches have therefore been suitable only for problems with both low-dimensional parameter spaces and function query costs sufficiently large to overshadow very significant optimization overheads. In this work, we derive a novel multi-task version of entropy search, delivering robust performance with low computational overheads across classic optimization challenges and multi-task hyper-parameter tuning. MUMBO is scalable and efficient, allowing multi-task Bayesian optimization to be deployed in problems with rich task and fidelity spaces.
Similar references
Multi-Task Bayesian Optimization
Bayesian optimization has recently been proposed as a framework for automatically tuning the hyperparameters of machine learning models and has been shown to yield state-of-the-art performance with impressive ease and efficiency. In this paper, we explore whether it is possible to transfer the knowledge gained from previous optimizations to new tasks in order to find optimal hyperparameter sett...
Max-value Entropy Search for Efficient Bayesian Optimization (Appendix)
Our work is also closely related to probability of improvement (PI) (Kushner, 1964), expected improvement (EI) (Močkus, 1974), and the BO algorithms using upper confidence bound to direct the search (Auer, 2002; Kawaguchi et al., 2015; 2016), such as GP-UCB (Srinivas et al., 2010). In (Wang et al., 2016), it was pointed out that GP-UCB and PI are closely related by exchanging the parameters. In...
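As an illustration of the GP-UCB rule mentioned in the snippet above (not code from any of the cited papers), the following is a minimal sketch in Python: a Gaussian-process posterior over a 1-D input, with the acquisition value being the posterior mean plus a confidence-scaled standard deviation. The RBF lengthscale and the exploration parameter `beta` are arbitrary illustrative choices.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_query, x_obs, y_obs, noise=1e-6):
    # Standard GP posterior mean and standard deviation at query points.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_query, x_obs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mu = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)  # k(x, x) = 1 for the RBF kernel
    return mu, np.sqrt(np.maximum(var, 1e-12))

def gp_ucb(x_query, x_obs, y_obs, beta=4.0):
    # GP-UCB acquisition: optimism in the face of uncertainty.
    mu, sigma = gp_posterior(x_query, x_obs, y_obs)
    return mu + np.sqrt(beta) * sigma
```

In a BO loop one would maximize `gp_ucb` over the search space to pick the next evaluation point; larger `beta` shifts the balance toward exploration.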
Max-value Entropy Search for Efficient Bayesian Optimization
Entropy Search (ES) and Predictive Entropy Search (PES) are popular and empirically successful Bayesian Optimization techniques. Both rely on a compelling information-theoretic motivation, and maximize the information gained about the arg max of the unknown function; yet, both are plagued by the expensive computation for estimating entropies. We propose a new criterion, Max-value Entropy Search...
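The max-value idea described above (conditioning on sampled values of the maximum rather than its location) admits a closed-form acquisition under a Gaussian posterior. The sketch below is illustrative only, not the authors' implementation: given posterior mean and standard deviation at candidate points and a set of sampled maxima `y_star_samples`, it averages the standard truncated-Gaussian entropy-reduction term.

```python
import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, y_star_samples):
    # Max-value entropy search: average, over sampled maxima y*,
    # of the information gain gamma*phi(gamma)/(2*Phi(gamma)) - log Phi(gamma),
    # where gamma = (y* - mu) / sigma.
    sigma = np.maximum(sigma, 1e-9)                 # guard against zero variance
    y_star = np.asarray(y_star_samples)
    gammas = (y_star[:, None] - mu[None, :]) / sigma[None, :]
    pdf = norm.pdf(gammas)
    cdf = np.maximum(norm.cdf(gammas), 1e-12)       # avoid log(0)
    vals = gammas * pdf / (2.0 * cdf) - np.log(cdf)
    return vals.mean(axis=0)
```

The samples of the maximum `y*` would in practice come from a Gumbel approximation or posterior function draws; here they are simply passed in, which keeps the sketch self-contained.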
Nonparametric Bayesian Multi-task Learning with Max-margin Posterior Regularization
Learning a common latent representation can capture the relationships and share statistical strength among multiple tasks. To automatically resolve the unknown dimensionality of the latent representation, nonparametric Bayesian methods have been successfully developed with a generative process describing the observed data. In this paper, we present a discriminative approach to learning nonparamet...
Bayesian Max-margin Multi-Task Learning with Data Augmentation
Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, thus allowing the discriminative max-margin methods to enjoy the great flexibility of Bayesian methods on incorporating rich prior information as well as performing nonp...
Journal
Journal title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-67664-3_27