
Exploring the Optimal Hyperparameters for XGBoost

Strategies for Tuning Hyperparameters in XGBoost to Achieve Maximum Performance

Renu Khandelwal
10 min read · Apr 10, 2023

In this post, you will explore the most popular hyperparameters for XGBoost and code implementation using HyperOpt.

Learn more about the XGBoost algorithm

Image by the author

What is a hyperparameter?

Machine learning models have two types of parameters:

  1. Model parameters: Learned and updated during the training process as the algorithm fits the data patterns, for example the weights of the neurons in a neural network.
  2. Hyperparameters: Set before training starts, these parameters configure the model and the learning process so that the loss function can be minimized effectively. They shape the structure and behavior of the model; examples include the learning rate, batch size, and choice of optimizer.

In short, hyperparameters are set before a machine learning model is trained; they control the model's behavior and are tuned to optimize its performance by reducing the loss.
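To make the distinction concrete, here is a minimal sketch (not from the original post; it assumes xgboost and scikit-learn are installed and uses the scikit-learn breast cancer dataset purely for illustration). The hyperparameters are fixed before training, while the model parameters, the trees and their leaf weights, are learned during fit.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Illustrative dataset; any tabular classification data would do.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hyperparameters: chosen before training starts.
model = XGBClassifier(
    n_estimators=200,    # number of boosting rounds
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    max_depth=4,         # maximum depth of each tree
    subsample=0.8,       # fraction of rows sampled per tree
)

# Model parameters: the trees themselves are learned here.
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```

The values above are arbitrary starting points; finding better ones automatically is exactly what the HyperOpt-based tuning covered later in this post is for.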
