Dul Norovsambuu

Title: Investigating hyperparameter tuning for popular statistical learning algorithms
Date: October 7th, 2025
Time: 3:00pm
Location: Zoom
Supervised by: Tom Loughin

Abstract:

Tuning hyperparameters for machine learning algorithms is vital for achieving peak performance. While many methods exist for tuning hyperparameters, few studies extensively compare tuning methods across a large collection of real data sets. This study compares three popular tuning methods (grid search, random search, and sequential model-based optimization) for speed and quality of hyperparameter selection on a large corpus of regression data sets. It also determines values and ranges that tend to perform best for the most influential hyperparameters of random forests, extreme gradient boosting, support vector regression, and neural networks, and compares the performance of optimally tuned versions of these algorithms to one another. The study was carried out by running these algorithms and tuning methods on 327 data sets and measuring computing time, the chosen hyperparameters, and performance as determined by mean squared prediction error. From these results we identify potential starting values for the hyperparameters of each algorithm and ranges within which the tuning techniques might best operate. We also find clear differences in the performance of the tuning methods across the four algorithms, with different methods favoured for different learning machines, and we identify settings for some of the tuning methods that should allow them to work efficiently on many data sets.