Bayesian Optimization Lower Bounds

Investigator: Jonathan Scarlett.

Black-box function optimization via expensive noisy samples has diverse applications in problems such as hyperparameter tuning, robotics, and molecular design. Various algorithms have previously been proposed with rigorous performance bounds guaranteeing a near-optimal solution. We provided the first algorithm-independent lower bounds on performance, using tools from information theory. These bounds certify the degree of optimality of existing algorithms and identify the cases in which the greatest improvements remain possible.
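To make the setting concrete, the following is a minimal sketch (not the paper's algorithm or analysis) of a standard Bayesian optimization loop: a Gaussian-process upper confidence bound (GP-UCB) strategy that optimizes a one-dimensional black-box function from expensive noisy samples. The kernel, grid size, noise level, and exploration parameter `beta` are illustrative choices, implemented here in plain NumPy.

```python
# Illustrative GP-UCB sketch for noisy 1-D black-box optimization.
# All hyperparameters (length scale, beta, noise level) are assumptions
# chosen for illustration, not values from the referenced paper.
import numpy as np

def rbf_kernel(a, b, length_scale=0.2):
    """Squared-exponential (RBF) kernel between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise_var=0.0025):
    """Posterior mean and variance of a zero-mean GP at the query points."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train)
    K_inv = np.linalg.inv(K)
    mean = K_s @ K_inv @ y_train
    var = 1.0 - np.sum((K_s @ K_inv) * K_s, axis=1)
    return mean, np.maximum(var, 1e-12)

def gp_ucb(f, n_iters=30, beta=2.0, noise_std=0.05, seed=0):
    """Maximize f on [0, 1] by repeatedly sampling the UCB maximizer."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 200)
    x_train = np.array([0.5])
    y_train = np.array([f(0.5) + noise_std * rng.standard_normal()])
    for _ in range(n_iters):
        mean, var = gp_posterior(x_train, y_train, grid, noise_std**2)
        ucb = mean + beta * np.sqrt(var)  # optimism in the face of uncertainty
        x_next = grid[np.argmax(ucb)]
        y_next = f(x_next) + noise_std * rng.standard_normal()
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, y_next)
    # Report the grid point with the highest posterior mean.
    mean, _ = gp_posterior(x_train, y_train, grid, noise_std**2)
    return grid[np.argmax(mean)]

# Example: a toy objective whose true maximizer is x = 0.7.
best = gp_ucb(lambda x: -(x - 0.7) ** 2)
```

The lower bounds in the paper apply to any such strategy: no algorithm, however it trades off exploration and exploitation, can beat the information-theoretic regret limits.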

Reference: Jonathan Scarlett, “Tight Regret Bounds for Bayesian Optimization in One Dimension,” International Conference on Machine Learning (ICML), 2018.
