[SPARK-55108][PYTHON] Use the latest pandas-stubs for type check #436

Triggered via push: January 21, 2026 07:07
Status: Success
Total duration: 3m 52s
Artifacts: 16

Annotations

2 errors
DynamicPartitionPruningHiveScanSuiteAEOn.broadcast a single key in a HashedRelation: org/apache/spark/sql/hive/DynamicPartitionPruningHiveScanSuiteAEOn#L1
  org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 100 in table fact with loadPath=file:/home/runner/work/spark/spark/target/tmp/warehouse-a3aebe35-5a2a-47cd-b764-7ec835e63359/fact/.hive-staging_hive_2026-01-20_23-27-17_747_7975136007018106446-1/-ext-10000

AdaptiveQueryExecSuite.SPARK-47148: AQE should avoid to submit shuffle job on cancellation: org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite#L929
  org.scalatest.exceptions.TestFailedException: errMsgList.exists(((x$25: String) => x$25.contains("coalesce test error"))) was false. The error message should contain 'coalesce test error', but got: ====== [SCALAR_SUBQUERY_TOO_MANY_ROWS] More than one row returned by a subquery used as an expression. SQLSTATE: 21000 == SQL (line 1, position 12) == SELECT id, (SELECT slow_udf() FROM range(2)) FROM range(5) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Full stacktrace of original doTryWithCallerStacktrace caller ======

Artifacts

Produced during runtime
Name                             Size     Digest
apache~spark~3019M8.dockerbuild  30.3 KB  sha256:58793059a88f6e6a68c147f0b1e26c551ab875db629ecbc2b3948a9fab96f2fa
apache~spark~361COS.dockerbuild  29.9 KB  sha256:2be070bbe21783b901a3efa37a305343094279dde2d3e37e641e0ba06597c198
apache~spark~56NMCI.dockerbuild  59.4 KB  sha256:937fa93d1aad4e362ac8951d9b9493e0d539cee81d3cbaac42847bfd9068c10c
apache~spark~API6HL.dockerbuild  33.2 KB  sha256:17b82cade0e197701b9c900cf9f3a01f48504e72b62b17da6fd0745235815a86
apache~spark~IA3T0R.dockerbuild  29.1 KB  sha256:91e557d0db98b6d404e68234ed701737a4761050886eae577f6ea9d65caaf4d2
apache~spark~L8Q4H8.dockerbuild  27.9 KB  sha256:d14a8a490d25a8d9b462240cbfd29cf57f8bb95feccc9eddfdf5065dd01beae5
apache~spark~MJTUE8.dockerbuild  30.5 KB  sha256:420e897193367e3522b813f9840a37e8a717886e4402a4868665582c2f1c4f94
apache~spark~N1ROO7.dockerbuild  55.2 KB  sha256:3743fc039665f9f8739359d4496c602cfd7ef2434a25481a11e67369fc88c0b5
apache~spark~P498R8.dockerbuild  28.1 KB  sha256:5c3a94c9db2283e0ce849a37a268262d956f37e1f744a7a437a1c371c6353cda
apache~spark~PSKM4F.dockerbuild  26.7 KB  sha256:c082cce0f3e06c4bc0290163cac710160b83651fb18c24558ce0f7751062afaf
apache~spark~RGSND2.dockerbuild  27.1 KB  sha256:3cb803e4db4120fece9a8725d6dfc4a33a73e7885a236905e2e00d787a384802
apache~spark~S496BV.dockerbuild  29.9 KB  sha256:99bfdb3a3aa1046aa9bca99227c0a3cc0f1cc04ab9c27ea5822f33c0a4326b80
apache~spark~VB9C76.dockerbuild  31 KB    sha256:42ab9636ba6d270b6bf6eadaa54512af3f74e40dea4579117c13eefe7f82666a
apache~spark~VOEZFX.dockerbuild  29.9 KB  sha256:e7f0afcc69b21d127c79c2c163165e3aee7ced848b7dfbc56d1826e287c02e53
apache~spark~XKS52B.dockerbuild  30.4 KB  sha256:59e2ac782b71ac4d73ebf516fe072c8f106b0d98ed1b5e4af75f3629167187c1
apache~spark~YL3NBE.dockerbuild  29.7 KB  sha256:f59d894207e9064c58b23dff004fb1b98a2924ad5ae6c7302918577e99b0318e
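Each artifact above is published with a sha256 digest, so a downloaded copy can be checked for integrity before use. A minimal sketch of that check, assuming the first artifact has been downloaded to the current directory (the filename and expected digest are taken from the table; everything else is illustrative):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex sha256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


# Expected digest for apache~spark~3019M8.dockerbuild, copied from the table.
EXPECTED = "58793059a88f6e6a68c147f0b1e26c551ab875db629ecbc2b3948a9fab96f2fa"

# Uncomment after downloading the artifact:
# assert sha256_of("apache~spark~3019M8.dockerbuild") == EXPECTED
```

Hashing in fixed-size chunks keeps memory use flat regardless of artifact size, which matters more for multi-gigabyte build outputs than for the ~30 KB files listed here.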