Skip to content

[python-package][dask] avoid warning for default n_jobs values#7195

Open
shiyu1994 wants to merge 1 commit intomasterfrom
fix/issue-6797-root
Open

[python-package][dask] avoid warning for default n_jobs values#7195
shiyu1994 wants to merge 1 commit intomasterfrom
fix/issue-6797-root

Conversation

@shiyu1994
Copy link
Copy Markdown
Collaborator

@shiyu1994 shiyu1994 commented Mar 12, 2026

Summary

  • avoid warning in lightgbm.dask._train() when a num_threads alias is present with default sklearn-constructor values (None or -1)
  • keep warning behavior unchanged for other explicit num_threads / num_machines values that are ignored
  • add a Dask test covering both default n_jobs alias values (-1, None)

Testing

  • python -m compileall -q python-package/lightgbm/dask.py tests/python_package_test/test_dask.py
  • targeted test_dask.py run could not be executed here because the available LightGBM test environment lacks dask / distributed and package install is blocked by network proxy

Fixes #6797

Copy link
Copy Markdown
Member

@jameslamb jameslamb left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

There is already an open PR attempting to close this, authored by (as far as I can tell) a human who put effort into it: #6987

I'd prefer to take that PR over this LLM-generated one.

It has been 8 months since they last interacted there though. I just asked them to respond. If we don't hear back soon, I'd be ok with closing that PR and pursuing this one. If we do that, please do see my notes about the tests.



@pytest.mark.parametrize("n_jobs", [-1, None])
def test_does_not_warn_on_default_n_jobs_alias_values(cluster, n_jobs):
Copy link
Copy Markdown
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This does not test the behavior described in #6797

  1. doesn't actually check the log output to confirm a warning isn't raised
  2. passes parameter n_jobs which is interesting, but the issue says the warning is raised even when not passing any multithreading related parameters

If we move forward with this PR, this needs to be updated to confirm that the issue is actually fixed.

@jameslamb jameslamb added the fix label Mar 13, 2026
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[python-package] [dask] Dask estimators raise an unavoidable warning

2 participants