Streamlining the Path from Data to Deployment: Intelligent Methods for Hyperparameter Tuning in Machine Learning
Abstract
This study addresses the essential role of hyperparameter optimization in complex machine learning models, particularly for image classification tasks. With manual tuning impractical in the face of escalating model complexity, the research thoroughly evaluates eight automated optimization methods: grid search, random search, Gaussian process Bayesian optimization (BO), Tree Parzen estimator BO, Hyperband, a BO/Hyperband hybrid, genetic algorithms, and particle swarm optimization. Assessments span diverse model architectures and performance metrics, considering accuracy, mean squared error, and optimization time. Grid search proves exhaustive but time-prohibitive, random search is sensitive to seed values, Gaussian process BO excels in low-dimensional spaces, and Tree Parzen estimator BO is efficient in higher dimensions. Hyperband prioritizes time efficiency, genetic algorithms pose parallelization challenges, and particle swarm optimization delivers the best balance of accuracy and efficiency. Distinct advantages emerge depending on model architecture and search-space complexity, highlighting the need for optimizers tailored to specific machine learning applications. Comprehensive benchmarks provide valuable guidance, and future work is recommended to extend these evaluations to emerging model classes, particularly deep neural networks.