Commit c4f62d4

Authored by LuciferYang, committed by zhengruifeng.
[SPARK-55191][INFRA][DOCS] Adjust the Python version used in the Python-only daily test
### What changes were proposed in this pull request?

This PR aims to adjust the Python version used in the Python-only daily tests:

1. Adjust the daily tests that use `python_hosted_runner_test.yml` to Python 3.12, including the Python ARM test and the Python macOS 26 test.
2. Update the Python classic-only test to Python 3.12 and modify the Dockerfile generation process accordingly.
3. Update the Python-only test from Python 3.12 to Python 3.11, so that at least one Python-only daily test still validates Python 3.11.

### Why are the changes needed?

To adjust the Python version used in the Python-only daily tests.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

- Pass GitHub Actions.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #53973 from LuciferYang/SPARK-55191.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
1 parent: baaa62b

8 files changed: +35 additions, −35 deletions

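For context on item 1 of the description: the version bump lands in the reusable workflow's `workflow_call` input default. A minimal sketch of that pattern (simplified; the job and step below are illustrative, not copied from the real workflow):

```yaml
# Sketch of a reusable workflow exposing a Python version input with a
# default, as python_hosted_runner_test.yml does. Callers that omit the
# `python` input now pick up 3.12 automatically.
on:
  workflow_call:
    inputs:
      python:
        required: false
        type: string
        default: "3.12"

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v5
        with:
          # Resolves to the caller's value, or "3.12" when omitted.
          python-version: ${{ inputs.python }}
```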

.github/workflows/build_infra_images_cache.yml (11 additions, 11 deletions)

```diff
@@ -36,8 +36,8 @@ on:
       - 'dev/spark-test-image/pypy-311/Dockerfile'
       - 'dev/spark-test-image/python-310/Dockerfile'
       - 'dev/spark-test-image/python-311/Dockerfile'
-      - 'dev/spark-test-image/python-311-classic-only/Dockerfile'
       - 'dev/spark-test-image/python-312/Dockerfile'
+      - 'dev/spark-test-image/python-312-classic-only/Dockerfile'
       - 'dev/spark-test-image/python-312-pandas-3/Dockerfile'
       - 'dev/spark-test-image/python-313/Dockerfile'
       - 'dev/spark-test-image/python-314/Dockerfile'
@@ -193,19 +193,19 @@ jobs:
       - name: Image digest (PySpark with Python 3.11)
         if: hashFiles('dev/spark-test-image/python-311/Dockerfile') != ''
         run: echo ${{ steps.docker_build_pyspark_python_311.outputs.digest }}
-      - name: Build and push (PySpark Classic Only with Python 3.11)
-        if: hashFiles('dev/spark-test-image/python-311-classic-only/Dockerfile') != ''
-        id: docker_build_pyspark_python_311_classic_only
+      - name: Build and push (PySpark Classic Only with Python 3.12)
+        if: hashFiles('dev/spark-test-image/python-312-classic-only/Dockerfile') != ''
+        id: docker_build_pyspark_python_312_classic_only
         uses: docker/build-push-action@v6
         with:
-          context: ./dev/spark-test-image/python-311-classic-only/
+          context: ./dev/spark-test-image/python-312-classic-only/
           push: true
-          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }}-static
-          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }}
-          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }},mode=max
-      - name: Image digest (PySpark Classic Only with Python 3.11)
-        if: hashFiles('dev/spark-test-image/python-311-classic-only/Dockerfile') != ''
-        run: echo ${{ steps.docker_build_pyspark_python_311_classic_only.outputs.digest }}
+          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }}-static
+          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }}
+          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }},mode=max
+      - name: Image digest (PySpark Classic Only with Python 3.12)
+        if: hashFiles('dev/spark-test-image/python-312-classic-only/Dockerfile') != ''
+        run: echo ${{ steps.docker_build_pyspark_python_312_classic_only.outputs.digest }}
       - name: Build and push (PySpark with Python 3.12)
         if: hashFiles('dev/spark-test-image/python-312/Dockerfile') != ''
         id: docker_build_pyspark_python_312
```
(3 additions, 3 deletions)

```diff
@@ -17,7 +17,7 @@
 # under the License.
 #
 
-name: "Build / Python-only (master, Python 3.12)"
+name: "Build / Python-only (master, Python 3.11)"
 
 on:
   schedule:
@@ -37,8 +37,8 @@ jobs:
       hadoop: hadoop3
       envs: >-
         {
-          "PYSPARK_IMAGE_TO_TEST": "python-312",
-          "PYTHON_TO_TEST": "python3.12"
+          "PYSPARK_IMAGE_TO_TEST": "python-311",
+          "PYTHON_TO_TEST": "python3.11"
         }
       jobs: >-
         {
```

.github/workflows/build_python_3.11_arm.yml renamed to .github/workflows/build_python_3.12_arm.yml (1 addition, 1 deletion)

```diff
@@ -17,7 +17,7 @@
 # under the License.
 #
 
-name: "Build / Python-only (master, Python 3.11, ARM)"
+name: "Build / Python-only (master, Python 3.12, ARM)"
 
 on:
   schedule:
```

.github/workflows/build_python_3.11_classic_only.yml renamed to .github/workflows/build_python_3.12_classic_only.yml (3 additions, 3 deletions)

```diff
@@ -17,7 +17,7 @@
 # under the License.
 #
 
-name: "Build / Python-only, Classic-only (master, Python 3.11)"
+name: "Build / Python-only, Classic-only (master, Python 3.12)"
 
 on:
   schedule:
@@ -37,8 +37,8 @@ jobs:
       hadoop: hadoop3
      envs: >-
         {
-          "PYSPARK_IMAGE_TO_TEST": "python-311-classic-only",
-          "PYTHON_TO_TEST": "python3.11"
+          "PYSPARK_IMAGE_TO_TEST": "python-312-classic-only",
+          "PYTHON_TO_TEST": "python3.12"
         }
      jobs: >-
         {
```

.github/workflows/build_python_3.11_macos26.yml renamed to .github/workflows/build_python_3.12_macos26.yml (1 addition, 1 deletion)

```diff
@@ -17,7 +17,7 @@
 # under the License.
 #
 
-name: "Build / Python-only (master, Python 3.11, MacOS26)"
+name: "Build / Python-only (master, Python 3.12, MacOS26)"
 
 on:
   schedule:
```

.github/workflows/python_hosted_runner_test.yml (1 addition, 1 deletion)

```diff
@@ -29,7 +29,7 @@ on:
       python:
         required: false
         type: string
-        default: 3.11
+        default: 3.12
       branch:
         description: Branch to run the build against
         required: false
```
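Because the `python` input is optional, daily workflows that call this reusable workflow without setting it now run on the new default, while any caller can still pin an older interpreter explicitly. A hedged sketch of a caller (the job name below is illustrative):

```yaml
# Illustrative caller of the reusable workflow. Omitting `python`
# picks up the 3.12 default; adding `python: "3.11"` under `with:`
# would pin the older interpreter instead.
jobs:
  python-arm-test:
    uses: ./.github/workflows/python_hosted_runner_test.yml
    with:
      branch: master
```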

README.md (5 additions, 5 deletions)

```diff
@@ -36,12 +36,12 @@ This README file only contains basic setup instructions.
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_maven_java21_arm.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_maven_java21_arm.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_coverage.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_coverage.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_pypy3.10.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_pypy3.10.yml) |
-| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.10.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.10.yml) |
-| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.11_classic_only.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.11_classic_only.yml) |
-| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.11_arm.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.11_arm.yml) |
-| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.11_macos26.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.11_macos26.yml) |
-| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.12.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.12.yml) |
+| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.11.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.11.yml) |
+| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml) |
+| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.12_classic_only.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.12_classic_only.yml) |
+| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.12_arm.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.12_arm.yml) |
+| | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.12_macos26.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.12_macos26.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.12_pandas_3.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.12_pandas_3.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.13.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.13.yml) |
 | | [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_python_3.14.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_python_3.14.yml) |
```

dev/spark-test-image/python-311-classic-only/Dockerfile renamed to dev/spark-test-image/python-312-classic-only/Dockerfile (10 additions, 10 deletions)

```diff
@@ -20,7 +20,7 @@
 FROM ubuntu:jammy-20240911.1
 LABEL org.opencontainers.image.authors="Apache Spark project <[email protected]>"
 LABEL org.opencontainers.image.licenses="Apache-2.0"
-LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark Classic with Python 3.11"
+LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark Classic with Python 3.12"
 # Overwrite this label to avoid exposing the underlying Ubuntu OS version label
 LABEL org.opencontainers.image.version=""
 
@@ -59,10 +59,10 @@ RUN apt-get update && apt-get install -y \
     wget \
     zlib1g-dev
 
-# Install Python 3.11
+# Install Python 3.12
 RUN add-apt-repository ppa:deadsnakes/ppa
 RUN apt-get update && apt-get install -y \
-    python3.11 \
+    python3.12 \
     && apt-get autoremove --purge -y \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
@@ -71,10 +71,10 @@ RUN apt-get update && apt-get install -y \
 ARG BASIC_PIP_PKGS="numpy pyarrow>=22.0.0 pandas==2.3.3 plotly<6.0.0 matplotlib openpyxl memory-profiler>=0.61.0 mlflow>=2.8.1 scipy scikit-learn>=1.3.2"
 ARG TEST_PIP_PKGS="coverage unittest-xml-reporting"
 
-# Install Python 3.11 packages
-RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.11
-RUN python3.11 -m pip install --ignore-installed 'blinker>=1.6.2' # mlflow needs this
-RUN python3.11 -m pip install $BASIC_PIP_PKGS $TEST_PIP_PKGS && \
-    python3.11 -m pip install torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
-    python3.11 -m pip install deepspeed torcheval && \
-    python3.11 -m pip cache purge
+# Install Python 3.12 packages
+RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.12
+RUN python3.12 -m pip install --ignore-installed 'blinker>=1.6.2' # mlflow needs this
+RUN python3.12 -m pip install $BASIC_PIP_PKGS $TEST_PIP_PKGS && \
+    python3.12 -m pip install torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
+    python3.12 -m pip install deepspeed torcheval && \
+    python3.12 -m pip cache purge
```
