Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models


Journal article


Ismail Damilola Raji, H. Bello-Salau, I. J. Umoh, A. Onumanyi, M. Adegboye, Ahmed Tijani Salawudeen
Applied Sciences, 2022

Cite
APA
Raji, I. D., Bello-Salau, H., Umoh, I. J., Onumanyi, A., Adegboye, M., & Salawudeen, A. T. (2022). Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models. Applied Sciences.


Chicago/Turabian
Raji, Ismail Damilola, H. Bello-Salau, I. J. Umoh, A. Onumanyi, M. Adegboye, and Ahmed Tijani Salawudeen. “Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models.” Applied Sciences (2022).


MLA
Raji, Ismail Damilola, et al. “Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models.” Applied Sciences, 2022.


BibTeX

@article{ismail2022a,
  title = {Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models},
  year = {2022},
  journal = {Applied Sciences},
  author = {Raji, Ismail Damilola and Bello-Salau, H. and Umoh, I. J. and Onumanyi, A. and Adegboye, M. and Salawudeen, Ahmed Tijani}
}

Abstract

Hyperparameter tuning is a critical step in the effective deployment of most machine learning (ML) algorithms. It is used to find the optimal hyperparameter settings of an ML algorithm in order to improve its overall output performance. To this end, several optimization strategies have been studied for fine-tuning the hyperparameters of many ML algorithms, especially in the absence of model-specific information. However, because most ML training procedures require a significant amount of computational time and memory, it is frequently necessary to build an optimization technique that converges within a small number of fitness evaluations. Consequently, a simple deterministic selection genetic algorithm (SDSGA) is proposed in this article. The SDSGA was realized by ensuring that both the chromosomes and their accompanying fitness values in the original genetic algorithm are selected in an elitist-like way. We assessed the SDSGA over a variety of mathematical test functions. It was then used to optimize the hyperparameters of two well-known machine learning models, namely, the convolutional neural network (CNN) and the random forest (RF) algorithm, with application to the MNIST and UCI classification datasets. The SDSGA’s efficiency was compared with that of Bayesian optimization (BO) and three other popular metaheuristic optimization algorithms (MOAs), namely, the genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO) algorithms. The results obtained reveal that the SDSGA performed better than the other MOAs on 11 of the 17 benchmark functions considered in our study. In optimizing the hyperparameters of the two ML models, it achieved marginally better accuracy than the other methods while requiring less computation time.
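The abstract only sketches the selection mechanism, so the snippet below is a minimal, illustrative Python sketch of what an elitist-like deterministic selection step inside a simple real-coded GA loop could look like for hyperparameter tuning. The function names (deterministic_select, sdsga_sketch) and all operator choices (best-half parent selection, uniform crossover, Gaussian mutation) are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def deterministic_select(population, fitnesses, n_keep):
    """Elitist-like deterministic selection (illustrative): keep the n_keep
    chromosomes with the best (lowest) fitness, together with their fitness values."""
    order = np.argsort(fitnesses)[:n_keep]
    return population[order], fitnesses[order]

def sdsga_sketch(fitness_fn, bounds, pop_size=20, n_gen=30, rng=None):
    """Tiny real-coded GA loop in which both parent and survivor selection
    are deterministic. bounds is an (n_params, 2) array of [low, high] per
    hyperparameter; fitness_fn maps a parameter vector to a loss to minimize."""
    if rng is None:
        rng = np.random.default_rng(0)
    low, high = bounds[:, 0], bounds[:, 1]
    n_params = bounds.shape[0]
    pop = rng.uniform(low, high, size=(pop_size, n_params))
    fit = np.array([fitness_fn(ind) for ind in pop])
    for _ in range(n_gen):
        # Deterministic parent selection: best half of the population,
        # in place of roulette-wheel or tournament selection.
        parents, parent_fit = deterministic_select(pop, fit, pop_size // 2)
        # Uniform crossover between randomly paired parents yields pop_size children.
        idx_a = rng.integers(0, len(parents), pop_size)
        idx_b = rng.integers(0, len(parents), pop_size)
        mask = rng.random((pop_size, n_params)) < 0.5
        children = np.where(mask, parents[idx_a], parents[idx_b])
        # Gaussian mutation, clipped back into the search bounds.
        children += rng.normal(0.0, 0.1 * (high - low), size=children.shape)
        children = np.clip(children, low, high)
        child_fit = np.array([fitness_fn(ind) for ind in children])
        # Deterministic survivor selection over parents + children keeps the elites.
        pop, fit = deterministic_select(
            np.vstack([parents, children]),
            np.concatenate([parent_fit, child_fit]),
            pop_size,
        )
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

To tune, for example, a random forest with this sketch, the fitness function would train the model with the candidate hyperparameters (rounding integer-valued ones such as the number of trees) and return a loss such as one minus the cross-validated accuracy.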

