Schneider, Lennart Paul (2025): Advancing hyperparameter optimization: foundations, multiple objectives and algorithmic innovations informed through benchmarking. Dissertation, LMU München: Fakultät für Mathematik, Informatik und Statistik
PDF: schneider_lennart_paul.pdf (22 MB)
Abstract
Hyperparameter optimization (HPO) is a fundamental aspect of machine learning (ML), directly influencing model performance and adaptability. As a computationally expensive black-box optimization problem, HPO requires efficient algorithms to identify optimal hyperparameter configurations. This thesis advances the field of HPO along three key dimensions: foundational insights, HPO in the presence of more than one objective, and algorithmic innovations through benchmarking. First, we revisit resampling strategies for performance estimation, demonstrating both theoretically and empirically that reshuffling resampling splits across hyperparameter configurations enhances generalization. Additionally, we conduct an in-depth analysis of HPO validation landscapes, revealing characteristics such as low multimodality and broad plateaus that differentiate them from conventional black-box optimization benchmarks. Second, we introduce novel algorithms for HPO in multi-objective and quality diversity settings. We propose a new approach for simultaneously optimizing model performance and interpretability, quantifying interpretability through feature sparsity, sparsity of interaction effects, and sparsity of non-monotone features. Furthermore, we bridge the field of quality diversity optimization with HPO, which allows us to discover diverse yet well-performing neural architectures that satisfy varying hardware constraints within a single optimization run. Third, we use benchmarking to drive algorithmic innovation and insights in HPO. We present YAHPO Gym, a scalable benchmarking suite supporting single-objective, multi-fidelity, and multi-objective HPO via surrogate benchmarks. Using this framework, we define new quality diversity problems inspired by HPO and develop a novel multi-fidelity optimization algorithm guided by programming by optimization principles. 
Additionally, we ablate a state-of-the-art neural architecture search algorithm to assess the impact of individual components and introduce a systematic approach for constructing synthetic black-box functions that admit specific optimization landscape properties. By deepening our general understanding of HPO, proposing novel multi-objective and quality diversity optimization strategies, and developing scalable benchmarking tools, this thesis enhances the efficiency and effectiveness of HPO across diverse ML applications.
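The abstract's first contribution — reshuffling resampling splits across hyperparameter configurations rather than fixing them once — can be illustrated with a minimal sketch. The function names (`kfold_indices`, `cv_estimate`, `hpo_loop`) and the per-configuration seeding scheme below are illustrative assumptions, not the thesis's actual implementation; the point is only that with reshuffling, each configuration is evaluated on its own random k-fold split, so split noise decorrelates across configurations.

```python
import random

def kfold_indices(n, k, seed):
    """Partition indices 0..n-1 into k folds, shuffled with the given seed."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cv_estimate(evaluate, config, n, k, seed):
    """Mean validation score of `config` over k-fold cross-validation."""
    folds = kfold_indices(n, k, seed)
    scores = []
    for i, val in enumerate(folds):
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        scores.append(evaluate(config, train, val))
    return sum(scores) / k

def hpo_loop(evaluate, configs, n, k, reshuffle=True):
    """Score each configuration and return the best (config, score) pair.

    With reshuffle=True, each configuration gets its own split seed;
    with reshuffle=False, all configurations share one fixed split.
    """
    results = []
    for c, config in enumerate(configs):
        seed = c if reshuffle else 0
        results.append((config, cv_estimate(evaluate, config, n, k, seed)))
    return max(results, key=lambda t: t[1])
```

A fixed split makes comparisons between configurations noisy in a correlated way (all configurations inherit the same lucky or unlucky partition); reshuffling trades that for independent noise per configuration, which the thesis argues improves generalization of the selected configuration.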
| Field | Value |
|---|---|
| Document type | Dissertation (LMU München) |
| Keywords | Hyperparameter Optimization, Black-Box Optimization, Resampling Strategies, Multi-Objective, Quality Diversity, Neural Architecture Search, Benchmarking |
| Subject areas | 000 General works, computer science, information science > 004 Computer science |
| Faculty | Fakultät für Mathematik, Informatik und Statistik |
| Language of thesis | English |
| Date of oral examination | 21 May 2025 |
| First referee | Bischl, Bernd |
| MD5 checksum of the PDF file | 19f68ef70af64ee01d502c6b0466eb43 |
| Call number of the printed edition | 0001/UMC 31265 |
| ID code | 35450 |
| Deposited on | 20 Jun 2025, 11:17 |
| Last modified | 23 Jun 2025, 07:17 |