Commit ef762a0
[SPARK-51627][INFRA][PYTHON] Add a scheduled workflow for numpy 2.1.3
### What changes were proposed in this pull request?

Add a scheduled (every 3 days) workflow for numpy 2.1.3.

### Why are the changes needed?

Many third-party packages are not yet compatible with numpy 2.2, e.g.

```
tensorflow only supports numpy<2.2.0
ydata-profiling 4.16.0 depends on numpy<2.2 and >=1.16.0
numba 0.61.0 depends on numpy<2.2 and >=1.24
```

This scheduled workflow makes sure PySpark keeps working with `numpy<2.2`.

### Does this PR introduce _any_ user-facing change?

No, infra-only.

### How was this patch tested?

PR builder with

```
default: '{"PYSPARK_IMAGE_TO_TEST": "numpy-213", "PYTHON_TO_TEST": "python3.11"}'
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #50426 from zhengruifeng/infra_more_env.

Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
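The version bounds quoted above can be sanity-checked with a small standalone sketch (version strings copied from the description; plain tuple comparison is a simplification of full PEP 440 ordering, which is enough for these numeric versions):

```python
# Sketch: confirm the pinned numpy 2.1.3 satisfies the third-party
# constraints quoted in the PR description (numpy<2.2, numba's >=1.24).
def parse_version(v):
    """Turn '2.1.3' into (2, 1, 3) for lexicographic comparison."""
    return tuple(int(p) for p in v.split("."))

pinned = "2.1.3"
assert parse_version(pinned) < parse_version("2.2.0")   # tensorflow/numba upper bound
assert parse_version(pinned) >= parse_version("1.24")   # numba lower bound
print("numpy", pinned, "satisfies numpy<2.2 and >=1.24")
```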
1 parent: 714884c

File tree: 3 files changed (+140, −0)

.github/workflows/build_infra_images_cache.yml

14 additions & 0 deletions

```diff
@@ -36,6 +36,7 @@ on:
       - 'dev/spark-test-image/python-311/Dockerfile'
       - 'dev/spark-test-image/python-312/Dockerfile'
       - 'dev/spark-test-image/python-313/Dockerfile'
+      - 'dev/spark-test-image/numpy-213/Dockerfile'
       - '.github/workflows/build_infra_images_cache.yml'
   # Create infra image when cutting down branches/tags
   create:
@@ -213,3 +214,16 @@ jobs:
       - name: Image digest (PySpark with Python 3.13)
         if: hashFiles('dev/spark-test-image/python-313/Dockerfile') != ''
         run: echo ${{ steps.docker_build_pyspark_python_313.outputs.digest }}
+      - name: Build and push (PySpark with Numpy 2.1.3)
+        if: hashFiles('dev/spark-test-image/numpy-213/Dockerfile') != ''
+        id: docker_build_pyspark_numpy_213
+        uses: docker/build-push-action@v6
+        with:
+          context: ./dev/spark-test-image/numpy-213/
+          push: true
+          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-numpy-213-cache:${{ github.ref_name }}-static
+          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-numpy-213-cache:${{ github.ref_name }}
+          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-numpy-213-cache:${{ github.ref_name }},mode=max
+      - name: Image digest (PySpark with Numpy 2.1.3)
+        if: hashFiles('dev/spark-test-image/numpy-213/Dockerfile') != ''
+        run: echo ${{ steps.docker_build_pyspark_numpy_213.outputs.digest }}
```
New workflow file, 47 additions & 0 deletions

```yaml
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#

name: "Build / Python-only (master, Python 3.11, Numpy 2.1.3)"

on:
  schedule:
    - cron: '0 3 */3 * *'
  workflow_dispatch:

jobs:
  run-build:
    permissions:
      packages: write
    name: Run
    uses: ./.github/workflows/build_and_test.yml
    if: github.repository == 'apache/spark'
    with:
      java: 17
      branch: master
      hadoop: hadoop3
      envs: >-
        {
          "PYSPARK_IMAGE_TO_TEST": "numpy-213",
          "PYTHON_TO_TEST": "python3.11"
        }
      jobs: >-
        {
          "pyspark": "true",
          "pyspark-pandas": "true"
        }
```
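The schedule `'0 3 */3 * *'` means 03:00 UTC on every third day of the month. The `*/3` step in the day-of-month field restarts from day 1 each month, which the following sketch makes explicit:

```python
# Sketch: which days of the month the cron day-of-month field "*/3" matches.
# A "*/3" step walks the 1-31 range starting at 1, so it fires on days
# 1, 4, 7, ... 31 (at minute 0, hour 3, per the other fields).
days = [d for d in range(1, 32) if (d - 1) % 3 == 0]
print(days)  # [1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31]
```

Note that "every 3 days" is approximate: because the step resets at month boundaries, the gap between day 31 and day 1 of the next month is shorter than three days.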
dev/spark-test-image/numpy-213/Dockerfile (new file), 79 additions & 0 deletions

```dockerfile
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Image for building and testing Spark branches. Based on Ubuntu 22.04.
# See also in https://hub.docker.com/_/ubuntu
FROM ubuntu:jammy-20240911.1
LABEL org.opencontainers.image.authors="Apache Spark project <[email protected]>"
LABEL org.opencontainers.image.licenses="Apache-2.0"
LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark with Python 3.11 and Numpy 2.1.3"
# Overwrite this label to avoid exposing the underlying Ubuntu OS version label
LABEL org.opencontainers.image.version=""

ENV FULL_REFRESH_DATE=20250327

ENV DEBIAN_FRONTEND=noninteractive
ENV DEBCONF_NONINTERACTIVE_SEEN=true

RUN apt-get update && apt-get install -y \
    build-essential \
    ca-certificates \
    curl \
    gfortran \
    git \
    gnupg \
    libcurl4-openssl-dev \
    libfontconfig1-dev \
    libfreetype6-dev \
    libfribidi-dev \
    libgit2-dev \
    libharfbuzz-dev \
    libjpeg-dev \
    liblapack-dev \
    libopenblas-dev \
    libpng-dev \
    libpython3-dev \
    libssl-dev \
    libtiff5-dev \
    libxml2-dev \
    openjdk-17-jdk-headless \
    pkg-config \
    qpdf \
    tzdata \
    software-properties-common \
    wget \
    zlib1g-dev

# Install Python 3.11
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get update && apt-get install -y \
    python3.11 \
    && apt-get autoremove --purge -y \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*


# Pin numpy==2.1.3
ARG BASIC_PIP_PKGS="numpy==2.1.3 pyarrow==19.0.0 six==1.16.0 pandas==2.2.3 scipy plotly<6.0.0 mlflow>=2.8.1 coverage matplotlib openpyxl memory-profiler>=0.61.0 scikit-learn>=1.3.2"
# Python deps for Spark Connect
ARG CONNECT_PIP_PKGS="grpcio==1.67.0 grpcio-status==1.67.0 protobuf==5.29.1 googleapis-common-protos==1.65.0 graphviz==0.20.3"

# Install Python 3.11 packages
RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.11
RUN python3.11 -m pip install --ignore-installed blinker>=1.6.2 # mlflow needs this
RUN python3.11 -m pip install $BASIC_PIP_PKGS unittest-xml-reporting $CONNECT_PIP_PKGS && \
    python3.11 -m pip cache purge
```
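To see at a glance which packages in `BASIC_PIP_PKGS` are hard-pinned versus left floating, the string can be split into (package, constraint) pairs with a small sketch (the regex handles only the `==`/`>=`/`<=`/`<`/`>` operators that appear in this Dockerfile, not full requirement syntax):

```python
# Sketch: split the BASIC_PIP_PKGS string from the Dockerfile above into
# (package, constraint) pairs; pinned packages are those with "==".
import re

BASIC_PIP_PKGS = ("numpy==2.1.3 pyarrow==19.0.0 six==1.16.0 pandas==2.2.3 "
                  "scipy plotly<6.0.0 mlflow>=2.8.1 coverage matplotlib "
                  "openpyxl memory-profiler>=0.61.0 scikit-learn>=1.3.2")

def split_requirement(req):
    """Split 'name<op>version' on the first comparison operator, if any."""
    m = re.match(r"([A-Za-z0-9_.-]+)(==|>=|<=|<|>)?(.*)", req)
    return m.group(1), (m.group(2) or "") + (m.group(3) or "")

pairs = [split_requirement(r) for r in BASIC_PIP_PKGS.split()]
pinned = {name: spec for name, spec in pairs if spec.startswith("==")}
print(pinned["numpy"])  # ==2.1.3
```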
