
Commit 7a47aba

dongjoon-hyun and terana authored and committed
[SPARK-56126][INFRA] Sync docker-related GitHub Actions versions to the ASF approved patterns
### What changes were proposed in this pull request?

This PR aims to sync `docker`-related GitHub Actions versions to the ASF approved patterns.

### Why are the changes needed?

Currently, the CI is blocked by the ASF check because of a recent change.

- https://github.com/apache/spark/actions/workflows/build_main.yml
  - https://github.com/apache/spark/actions/runs/23362042477
- https://github.com/apache/spark/actions/workflows/build_non_ansi.yml
  - https://github.com/apache/spark/actions/runs/23369253367

> The actions docker/login-action@v3, docker/setup-qemu-action@v3, docker/setup-buildx-action@v3, and docker/build-push-action@v6 are not allowed in apache/spark because all actions must be from a repository owned by your enterprise, created by GitHub, or match one of the patterns:

<img width="905" height="380" alt="Screenshot 2026-03-20 at 20 32 56" src="https://github.com/user-attachments/assets/2582b68a-6303-44ab-b961-d9b753072f1e" />

This is due to the following change.

- apache/infrastructure-actions#547

As of now, the updated patterns are the following.

- https://github.com/apache/infrastructure-actions/blob/07f5f9d2b05fe0ec9886e3ef0a9d79797817f0cb/approved_patterns.yml#L100-L104

```
- docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
- docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
- docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051
- docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
- docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manually checked like the following, because the updated CI should be triggered manually.

```
$ git grep 'uses: docker' | sort | uniq -c
   5 .github/workflows/build_and_test.yml:        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
   1 .github/workflows/build_and_test.yml:        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
   1 .github/workflows/build_and_test.yml:        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
   1 .github/workflows/build_and_test.yml:        uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
  16 .github/workflows/build_infra_images_cache.yml:        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
   1 .github/workflows/build_infra_images_cache.yml:        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
   1 .github/workflows/build_infra_images_cache.yml:        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
   1 .github/workflows/build_infra_images_cache.yml:        uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
```

### Was this patch authored or co-authored using generative AI tooling?

Generated-by: Claude Code (claude-opus-4-6)

Closes apache#54935 from dongjoon-hyun/SPARK-56126.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
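The manual check above can be made mechanical. Below is a minimal sketch (the `check_pinned` helper is hypothetical, not part of this PR) that accepts only `uses:` references pinned to a full 40-character commit SHA, which is what the updated ASF approved patterns effectively require for these `docker/*` actions:

```shell
# Hypothetical helper: succeed only when a `uses:` reference is pinned to a
# full 40-character commit SHA rather than a mutable tag such as @v3.
check_pinned() {
  printf '%s\n' "$1" | grep -Eq 'uses: [^@]+@[0-9a-f]{40}[[:space:]]*$'
}

check_pinned 'uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9' \
  && echo "pinned"
check_pinned 'uses: docker/login-action@v3' || echo "not pinned: mutable tag"
```

In the repository this could be fed line by line from `git grep -h 'uses: docker' .github/workflows` to reproduce the check shown in the commit message.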
1 parent f83d019 commit 7a47aba

File tree

3 files changed: +28 −28 lines

.github/workflows/build_and_test.yml

Lines changed: 8 additions & 8 deletions

```diff
@@ -455,7 +455,7 @@ jobs:
       packages: write
     steps:
       - name: Login to GitHub Container Registry
-        uses: docker/login-action@v3
+        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
@@ -475,13 +475,13 @@ jobs:
           git -c user.name='Apache Spark Test Account' -c user.email='[email protected]' merge --no-commit --progress --squash FETCH_HEAD
           git -c user.name='Apache Spark Test Account' -c user.email='[email protected]' commit -m "Merged commit" --allow-empty
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3
+        uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
       - name: Build and push for branch-3.5
         if: inputs.branch == 'branch-3.5'
         id: docker_build
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/infra/
           push: true
@@ -492,7 +492,7 @@ jobs:
       - name: Build and push (Documentation)
         if: ${{ inputs.branch != 'branch-3.5' && fromJson(needs.precondition.outputs.required).docs == 'true' && hashFiles('dev/spark-test-image/docs/Dockerfile') != '' }}
         id: docker_build_docs
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/docs/
           push: true
@@ -503,7 +503,7 @@ jobs:
       - name: Build and push (Linter)
         if: ${{ inputs.branch != 'branch-3.5' && fromJson(needs.precondition.outputs.required).lint == 'true' && hashFiles('dev/spark-test-image/lint/Dockerfile') != '' }}
         id: docker_build_lint
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/lint/
           push: true
@@ -514,7 +514,7 @@ jobs:
       - name: Build and push (SparkR)
         if: ${{ inputs.branch != 'branch-3.5' && fromJson(needs.precondition.outputs.required).sparkr == 'true' && hashFiles('dev/spark-test-image/sparkr/Dockerfile') != '' }}
         id: docker_build_sparkr
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/sparkr/
           push: true
@@ -526,7 +526,7 @@ jobs:
         if: ${{ inputs.branch != 'branch-3.5' && (fromJson(needs.precondition.outputs.required).pyspark == 'true' || fromJson(needs.precondition.outputs.required).pyspark-pandas == 'true') && env.PYSPARK_IMAGE_TO_TEST != '' }}
         id: docker_build_pyspark
         env: ${{ fromJSON(inputs.envs) }}
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/${{ env.PYSPARK_IMAGE_TO_TEST }}/
           push: true
```
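Pins like the ones above are easy to regress when new steps are added later. A small audit loop over the workflow files can list any remaining mutable tags; this is a sketch under the assumption that workflows live in `.github/workflows` (the `audit_unpinned` name is illustrative, not part of this PR):

```shell
# Illustrative audit: print every `uses:` reference in the given files that
# is NOT pinned to a 40-character commit SHA. Demonstrated on an inline
# sample; in the repository you would pass .github/workflows/*.yml instead.
audit_unpinned() {
  grep -hoE 'uses: [A-Za-z0-9_.-]+/[A-Za-z0-9_./-]+@[^[:space:]]+' "$@" \
    | grep -vE '@[0-9a-f]{40}$' \
    | sort -u
}

sample=$(mktemp)
cat > "$sample" <<'EOF'
        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
        uses: docker/setup-qemu-action@v3
EOF
audit_unpinned "$sample"   # only the @v3 reference is reported
rm -f "$sample"
```

Note that this reports every tag-pinned action, including ones (such as `actions/checkout@v6` below) that the ASF patterns may still allow, so its output is a review aid rather than a hard failure condition.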

.github/workflows/build_infra_images_cache.yml

Lines changed: 19 additions & 19 deletions

```diff
@@ -56,18 +56,18 @@ jobs:
       - name: Checkout Spark repository
        uses: actions/checkout@v6
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3
+        uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
       - name: Login to DockerHub
-        uses: docker/login-action@v3
+        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
       - name: Build and push
         id: docker_build
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/infra/
           push: true
@@ -79,7 +79,7 @@ jobs:
       - name: Build and push (Documentation)
         if: hashFiles('dev/spark-test-image/docs/Dockerfile') != ''
         id: docker_build_docs
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/docs/
           push: true
@@ -92,7 +92,7 @@ jobs:
       - name: Build and push (Linter)
         if: hashFiles('dev/spark-test-image/lint/Dockerfile') != ''
         id: docker_build_lint
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/lint/
           push: true
@@ -105,7 +105,7 @@ jobs:
       - name: Build and push (SparkR)
         if: hashFiles('dev/spark-test-image/sparkr/Dockerfile') != ''
         id: docker_build_sparkr
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/sparkr/
           push: true
@@ -118,7 +118,7 @@ jobs:
       - name: Build and push (PySpark with old dependencies)
         if: hashFiles('dev/spark-test-image/python-minimum/Dockerfile') != ''
         id: docker_build_pyspark_python_minimum
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-minimum/
           push: true
@@ -131,7 +131,7 @@ jobs:
       - name: Build and push (PySpark PS with old dependencies)
         if: hashFiles('dev/spark-test-image/python-ps-minimum/Dockerfile') != ''
         id: docker_build_pyspark_python_ps_minimum
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-ps-minimum/
           push: true
@@ -144,7 +144,7 @@ jobs:
       - name: Build and push (PySpark with PyPy 3.10)
         if: hashFiles('dev/spark-test-image/pypy-310/Dockerfile') != ''
         id: docker_build_pyspark_pypy_310
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/pypy-310/
           push: true
@@ -157,7 +157,7 @@ jobs:
       - name: Build and push (PySpark with PyPy 3.11)
         if: hashFiles('dev/spark-test-image/pypy-311/Dockerfile') != ''
         id: docker_build_pyspark_pypy_311
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/pypy-311/
           push: true
@@ -170,7 +170,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.10)
         if: hashFiles('dev/spark-test-image/python-310/Dockerfile') != ''
         id: docker_build_pyspark_python_310
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-310/
           push: true
@@ -183,7 +183,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.11)
         if: hashFiles('dev/spark-test-image/python-311/Dockerfile') != ''
         id: docker_build_pyspark_python_311
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-311/
           push: true
@@ -196,7 +196,7 @@ jobs:
       - name: Build and push (PySpark Classic Only with Python 3.12)
         if: hashFiles('dev/spark-test-image/python-312-classic-only/Dockerfile') != ''
         id: docker_build_pyspark_python_312_classic_only
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-312-classic-only/
           push: true
@@ -209,7 +209,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.12)
         if: hashFiles('dev/spark-test-image/python-312/Dockerfile') != ''
         id: docker_build_pyspark_python_312
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-312/
           push: true
@@ -222,7 +222,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.12 Pandas 3)
         if: hashFiles('dev/spark-test-image/python-312-pandas-3/Dockerfile') != ''
         id: docker_build_pyspark_python_312_pandas_3
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-312-pandas-3/
           push: true
@@ -235,7 +235,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.13)
         if: hashFiles('dev/spark-test-image/python-313/Dockerfile') != ''
         id: docker_build_pyspark_python_313
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-313/
           push: true
@@ -248,7 +248,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.14)
         if: hashFiles('dev/spark-test-image/python-314/Dockerfile') != ''
         id: docker_build_pyspark_python_314
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-314/
           push: true
@@ -261,7 +261,7 @@ jobs:
       - name: Build and push (PySpark with Python 3.14 no GIL)
         if: hashFiles('dev/spark-test-image/python-314-nogil/Dockerfile') != ''
         id: docker_build_pyspark_python_314_nogil
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
         with:
           context: ./dev/spark-test-image/python-314-nogil/
           push: true
```
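A pinned SHA is opaque compared to a tag, so it helps to keep the tag-to-SHA mapping in an auditable form. A minimal sketch (the `sha_for` helper is illustrative; the mappings themselves are exactly the ones this commit applies):

```shell
# Tag -> pinned SHA mapping applied by this commit, kept as a lookup so a
# reviewer can re-check each pin (e.g. with `git ls-remote` against the
# action's repository) without re-reading every hunk.
sha_for() {
  case "$1" in
    docker/login-action@v3)        echo c94ce9fb468520275223c153574b00df6fe4bcc9 ;;
    docker/setup-qemu-action@v3)   echo 29109295f81e9208d7d86ff1c6c12d2833863392 ;;
    docker/setup-buildx-action@v3) echo 8d2750c68a42422c14e847fe6c8ac0403b4cbd6f ;;
    docker/build-push-action@v6)   echo 10e90e3645eae34f1e60eeb005ba3a3d33f178e8 ;;
    *) echo "unknown action: $1" >&2; return 1 ;;
  esac
}

sha_for docker/build-push-action@v6
```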

.github/workflows/test_report.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -42,7 +42,7 @@ jobs:
           run-id: ${{ github.event.workflow_run.id }}
           pattern: "test-*"
       - name: Publish test report
-        uses: scacap/action-surefire-report@v1
+        uses: scacap/action-surefire-report@5609ce4db72c09db044803b344a8968fd1f315da
         with:
           check_name: Report test results
           github_token: ${{ secrets.GITHUB_TOKEN }}
```
