Revisiting default parameter settings? #4986
Hello all,
I came upon a recent JMLR paper that examined the "tunability" of the hyperparameters of multiple algorithms, including XGBoost.
Their methodology, as far as I understand it, is to take the package's default parameters, find the (near-)optimal parameters for each dataset in their benchmark, and measure how much performance is gained by tuning a particular parameter away from its default.
In doing so they also derive "optimal defaults" (Table 3 of the paper) and provide an interactive Shiny app for exploring the results.
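To make the idea concrete, here is a minimal sketch of the tunability computation described above, using plain Python and entirely made-up accuracy numbers (the real paper uses surrogate models over many OpenML datasets; none of these values come from it). The tunability of the algorithm is the average gain of full tuning over the defaults; the tunability of a single parameter (here `eta`, as an illustrative example) is the average gain from tuning only that parameter while the rest stay at their defaults.

```python
# Illustrative sketch of the "tunability" idea, with fabricated scores.
# scores per dataset: accuracy with defaults, with only eta tuned,
# and with all hyperparameters tuned. All numbers are hypothetical.
datasets = {
    "d1": {"defaults": 0.80, "tuned_eta": 0.83, "tuned_all": 0.86},
    "d2": {"defaults": 0.75, "tuned_eta": 0.75, "tuned_all": 0.79},
    "d3": {"defaults": 0.90, "tuned_eta": 0.92, "tuned_all": 0.93},
}

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

# Tunability of the whole algorithm: average gain of full tuning over defaults.
algo_tunability = mean(s["tuned_all"] - s["defaults"] for s in datasets.values())

# Tunability of one parameter: average gain from tuning only that parameter.
eta_tunability = mean(s["tuned_eta"] - s["defaults"] for s in datasets.values())

print(f"algorithm tunability: {algo_tunability:.3f}")
print(f"eta tunability:       {eta_tunability:.3f}")
```

An "optimal default" in the paper's sense would then be the single parameter value that maximizes average performance across all datasets, rather than the value tuned per dataset.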
This made me curious about how the defaults for XGBoost were chosen and if it's something that the community would be interested in revisiting in the future.