HYPERPARAMETER OPTIMIZATION FOR RANDOM FOREST AND K-NN IN HIGH-DIMENSIONAL REGRESSION PROBLEMS

Authors

  • Muhammad Hamraz
  • Naz Gul
  • Nosheen Faiz
  • Soofia Iftikhar

DOI:

https://doi.org/10.63075/42xkb331

Keywords:

Regression, hyperparameters, k-NN, Random Forest, High-dimensional data

Abstract

Machine learning algorithms have been widely used in various application areas. To effectively apply a machine learning model to different problems, its hyperparameters must be properly tuned. Selecting an optimal hyperparameter configuration has a direct impact on a model’s performance and often requires in-depth knowledge of machine learning algorithms as well as suitable hyperparameter optimization techniques. Although several automatic optimization methods are available, their effectiveness varies depending on the type of problem being addressed. In this study, the optimization of hyperparameters for commonly used machine learning models is investigated. The models considered include the k-nearest neighbors (k-NN) and Random Forest algorithms. For the analysis, two benchmark datasets—Boston and Longley—are used. Each dataset is divided into 80% training and 20% testing subsets. The mean squared error (MSE) is employed as the performance metric to identify the optimal set of hyperparameters for both models. Additionally, boxplots are constructed to visualize and compare the performance of different hyperparameter configurations.

Published

2025-06-30

How to Cite

HYPERPARAMETER OPTIMIZATION FOR RANDOM FOREST AND K-NN IN HIGH-DIMENSIONAL REGRESSION PROBLEMS. (2025). Review Journal of Neurological & Medical Sciences Review, 3(2), 433-441. https://doi.org/10.63075/42xkb331