Revisiting default parameter settings? #4986

@thvasilo

Description

Hello all,

I came across a recent JMLR paper that examined the "tunability" of the hyperparameters of several algorithms, including XGBoost.

Their methodology, as far as I understand it, is to start from each package's default parameters, find (near) optimal parameters for every dataset in their evaluation, and then measure how much performance is gained by tuning a particular parameter away from its default.

In doing so, they also derive "optimal defaults" (their Table 3) and provide an interactive Shiny app.

This made me curious about how the defaults for XGBoost were chosen and if it's something that the community would be interested in revisiting in the future.
