
Commit cbcee8c

Authored by LuciferYang, committed by dongjoon-hyun
[SPARK-55986][PYTHON] Upgrade black to 26.3.1
### What changes were proposed in this pull request?

This PR aims to upgrade black from 23.12.1 to 26.3.1.

### Why are the changes needed?

To fix https://github.com/apache/spark/security/dependabot/172

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass GitHub Actions.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #54782 from LuciferYang/black-26.3.1.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
1 parent 73dd6ed · commit cbcee8c

File tree: 273 files changed, +1970 −2350 lines


.github/workflows/build_and_test.yml
Lines changed: 2 additions & 2 deletions

@@ -842,7 +842,7 @@ jobs:
           python-version: '3.12'
       - name: Install dependencies for Python CodeGen check
         run: |
-          python3.12 -m pip install 'black==23.12.1' 'protobuf==6.33.5' 'mypy==1.8.0' 'mypy-protobuf==3.3.0'
+          python3.12 -m pip install 'black==26.3.1' 'protobuf==6.33.5' 'mypy==1.8.0' 'mypy-protobuf==3.3.0'
           python3.12 -m pip list
       - name: Python CodeGen check for branch-3.5
         if: inputs.branch == 'branch-3.5'
@@ -944,7 +944,7 @@ jobs:
         run: |
           # SPARK-45212: Copy from https://github.com/apache/spark/blob/555c8def51e5951c7bf5165a332795e9e330ec9d/.github/workflows/build_and_test.yml#L631-L638
           # Should delete this section after SPARK 3.5 EOL.
-          python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.982' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc 'jinja2<3.0.0' 'black==22.6.0'
+          python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.982' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc 'jinja2<3.0.0' 'black==26.3.1'
           python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython 'grpcio==1.56.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0'
       - name: List Python packages
         shell: 'script -q -e -c "bash {0}"'

.github/workflows/pages.yml
Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ jobs:
         run: |
           pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13' sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe 'pyzmq<24.0.0' \
             ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.22' pyarrow 'pandas==2.3.3' 'plotly>=4.8' 'docutils<0.18.0' \
-            'flake8==3.9.0' 'mypy==1.8.0' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==23.12.1' \
+            'flake8==3.9.0' 'mypy==1.8.0' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==26.3.1' \
             'pandas-stubs==1.2.0.53' 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.5' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
             'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' 'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' 'sphinxcontrib-serializinghtml==1.1.5'
       - name: Install Ruby for documentation generation

dev/create-release/spark-rm/Dockerfile
Lines changed: 1 addition & 1 deletion

@@ -61,7 +61,7 @@ RUN python3.10 -m pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13'
     sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe 'pyzmq<24.0.0' \
     ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.22' pyarrow pandas \
     'plotly>=4.8' 'docutils<0.18.0' 'flake8==3.9.0' 'mypy==1.19.1' 'pytest==7.1.3' \
-    'pytest-mypy-plugins==1.9.3' 'black==23.12.1' 'pandas-stubs==1.2.0.53' \
+    'pytest-mypy-plugins==1.9.3' 'black==26.3.1' 'pandas-stubs==1.2.0.53' \
     'grpcio==1.76.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
     'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' \
     'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' \

dev/merge_spark_pr.py
Lines changed: 2 additions & 2 deletions

@@ -345,7 +345,7 @@ def resolve_jira_issue(merge_branches, comment, default_jira_id=""):
         # In this case, if the PR is committed to the master branch and the release branch, we
         # only consider the release branch to be the fix version. E.g. it is not valid to have
         # both 1.1.0 and 1.0.0 as fix versions.
-        (major, minor, patch) = v.split(".")
+        major, minor, patch = v.split(".")
         if patch == "0":
             previous = "%s.%s.%s" % (major, int(minor) - 1, 0)
             if previous in default_fix_versions:
@@ -737,7 +737,7 @@ def main():
 if __name__ == "__main__":
     import doctest

-    (failure_count, test_count) = doctest.testmod()
+    failure_count, test_count = doctest.testmod()
     if failure_count:
         sys.exit(-1)
     try:
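The two hunks above drop the parentheses around tuple-unpacking targets. Parenthesized and bare unpacking are the same assignment statement in Python, which is why the upgraded black can strip the redundant parentheses without changing behavior. A minimal sketch (the version string is a made-up example, not from the PR):

```python
# Parenthesized and bare tuple unpacking are the identical assignment
# statement; newer black removes the redundant left-hand-side parentheses.
v = "4.1.0"  # hypothetical version string for illustration
(major, minor, patch) = v.split(".")     # style before this PR
major2, minor2, patch2 = v.split(".")    # style black now produces

assert (major, minor, patch) == (major2, minor2, patch2) == ("4", "1", "0")
```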

dev/reformat-python
Lines changed: 1 addition & 1 deletion

@@ -22,7 +22,7 @@ FWDIR="$( cd "$DIR"/.. && pwd )"
 cd "$FWDIR"

 BLACK_BUILD="${PYTHON_EXECUTABLE} -m black"
-BLACK_VERSION="23.12.1"
+BLACK_VERSION="26.3.1"
 $PYTHON_EXECUTABLE -c 'import black' 2> /dev/null
 if [ $? -ne 0 ]; then
   echo "The Python library providing the 'black' module was not found. Please install Black, for example, via 'pip install black==$BLACK_VERSION'."
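The reformat script pins the expected formatter release in `BLACK_VERSION`. A side note on comparing such pins: dotted version strings compare correctly only as integer tuples, not lexicographically. A small illustrative helper (hypothetical, not part of the script):

```python
def version_tuple(v: str) -> tuple:
    """Parse a dotted release string like '26.3.1' into a comparable int tuple."""
    return tuple(int(part) for part in v.split("."))

# Integer tuples order release components correctly...
assert version_tuple("26.3.1") > version_tuple("23.12.1")
# ...whereas plain string comparison would get it wrong ('2' sorts before '3'):
assert "26.3.1" < "3.0.0"
```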

dev/requirements.txt
Lines changed: 1 addition & 1 deletion

@@ -59,7 +59,7 @@ jira>=3.5.2
 PyGithub

 # pandas API on Spark Code formatter.
-black==23.12.1
+black==26.3.1
 py

 # Spark Connect (required)

dev/spark-test-image/connect-gen-protos/Dockerfile
Lines changed: 1 addition & 1 deletion

@@ -53,7 +53,7 @@ ENV PATH="$VIRTUAL_ENV/bin:$PATH"
 RUN python3.12 -m pip install \
     'mypy==1.19.1' \
     'mypy-protobuf==3.3.0' \
-    'black==23.12.1'
+    'black==26.3.1'

 # Mount the Spark repo at /spark
 WORKDIR /spark

dev/spark-test-image/docs/Dockerfile
Lines changed: 1 addition & 1 deletion

@@ -89,7 +89,7 @@ ENV PATH="$VIRTUAL_ENV/bin:$PATH"
 # See 'docutils<0.18.0' in SPARK-39421
 RUN python3.12 -m pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13' sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe \
     ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.22' 'pyarrow>=23.0.0' 'pandas==2.3.3' 'plotly>=4.8' 'docutils<0.18.0' \
-    'flake8==3.9.0' 'mypy==1.19.1' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==23.12.1' \
+    'flake8==3.9.0' 'mypy==1.19.1' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==26.3.1' \
     'pandas-stubs==1.2.0.53' 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.5' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
     'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' 'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' 'sphinxcontrib-serializinghtml==1.1.5' \
     && python3.12 -m pip cache purge

dev/spark-test-image/lint/Dockerfile
Lines changed: 1 addition & 1 deletion

@@ -77,7 +77,7 @@ RUN python3.12 -m venv $VIRTUAL_ENV
 ENV PATH="$VIRTUAL_ENV/bin:$PATH"

 RUN python3.12 -m pip install \
-    'black==23.12.1' \
+    'black==26.3.1' \
     'flake8==3.9.0' \
     'ruff==0.14.8' \
     'googleapis-common-protos-stubs==2.2.0' \

dev/sparktestsupport/modules.py
Lines changed: 9 additions & 9 deletions

@@ -253,9 +253,9 @@ def __hash__(self):
     sbt_test_goals=[
         "catalyst/test",
     ],
-    environ=None
-    if "GITHUB_ACTIONS" not in os.environ
-    else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"},
+    environ=(
+        None if "GITHUB_ACTIONS" not in os.environ else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"}
+    ),
 )

 sql = Module(
@@ -268,9 +268,9 @@ def __hash__(self):
     sbt_test_goals=[
         "sql/test",
     ],
-    environ=None
-    if "GITHUB_ACTIONS" not in os.environ
-    else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"},
+    environ=(
+        None if "GITHUB_ACTIONS" not in os.environ else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"}
+    ),
 )

 hive = Module(
@@ -1688,9 +1688,9 @@ def __hash__(self):
     build_profile_flags=["-Pdocker-integration-tests"],
     source_file_regexes=["connector/docker-integration-tests"],
     sbt_test_goals=["docker-integration-tests/test"],
-    environ=None
-    if "GITHUB_ACTIONS" not in os.environ
-    else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"},
+    environ=(
+        None if "GITHUB_ACTIONS" not in os.environ else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"}
+    ),
     test_tags=["org.apache.spark.tags.DockerTest"],
 )
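The modules.py hunks show the upgraded black wrapping multi-line conditional expressions in keyword arguments with explicit parentheses. Both spellings are the same expression and evaluate identically; only the layout changes. A minimal sketch of the equivalence:

```python
import os

# The value before and after the reformat is the same conditional expression;
# newer black parenthesizes it when it spans multiple lines.
environ_old = None if "GITHUB_ACTIONS" not in os.environ else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"}
environ_new = (
    None if "GITHUB_ACTIONS" not in os.environ else {"ENABLE_DOCKER_INTEGRATION_TESTS": "1"}
)

assert environ_old == environ_new
```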
