Build / Python-only (master, Python 3.13) #283
build_python_3.13.yml
on: schedule
Run / Check changes (38s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Java 25 build with Maven (0s)
Run / Run TPC-DS queries with SF=1 (0s)
Run / Run Docker integration tests (0s)
Run / Run Spark on Kubernetes Integration test (0s)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (0s)
Run / Linters, licenses, and dependencies (0s)
Run / Documentation generation (0s)
Matrix: Run / pyspark
Annotations
2 errors and 2 warnings

Errors:
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines — The operation was canceled.
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines — The job has exceeded the maximum execution time of 2h0m0s.

Warnings:
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger — No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger — The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
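Both annotations above point at workflow configuration: the error is a job hitting the default 2h0m0s cap, and the warning names the `conda-remove-defaults` input. A minimal sketch of the corresponding fixes in a workflow file, assuming the job uses `conda-incubator/setup-miniconda` and noting that the job id, runner, and step layout here are illustrative, not copied from the actual `build_python_3.13.yml`:

```yaml
jobs:
  pyspark-build:                # illustrative job id, not the real one
    runs-on: ubuntu-latest
    # Raise the per-job cap past the 2h0m0s the mllib/ml matrix entry exceeded
    timeout-minutes: 180
    steps:
      - uses: conda-incubator/setup-miniconda@v3
        with:
          # Addresses the "'defaults' channel might have been added
          # implicitly" warning, as suggested by the annotation itself
          conda-remove-defaults: "true"
```

Whether raising the timeout is the right call (versus splitting the slow module group) depends on why the mllib/ml jobs ran long, which this summary alone does not show.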
Artifacts
Produced during runtime
Name | Size | Digest
---|---|---
apache~spark~DY470W.dockerbuild | 27.3 KB | sha256:5184c3064d3c340691a3ea51b3317008485350969731a75ded747e33bcfa1c90
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.13 | 203 KB | sha256:0eeea0a95ffa3afb137a9f96cdcd2069775c275033520e09894e31e7725edc91
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.13 | 41.1 KB | sha256:103bf4460aa3c189087b39cb38d1d4e4235db6185f05a4050eb419ce472c3f66
test-results-pyspark-pandas--17-hadoop3-hive2.3-python3.13 | 194 KB | sha256:053e77ce87491e3039cc61530d508f4e206563977b327725a6eaf8a42842ee05
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3-python3.13 | 146 KB | sha256:f47f0d0037bf8a10f80922132518907b4e69ad8d81d4fe0a1c5480290c9feb0d
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3-python3.13 | 110 KB | sha256:96ad2791ee4878666a3ad401c6891677b3bef1ded6a995ff09b6935e01568ebc
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3-python3.13 | 81.3 KB | sha256:a5accb0bdc0b2ce245de9d4b2be0630a9256b51bff96fc87a6b31d3e23307d19
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3-python3.13 | 49.4 KB | sha256:0e76c04099c9f7aed6ea93af73f06066e80247c08c45739d43ebfdaf06339e9f
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.13 | 173 KB | sha256:1866cdf94215c5f60479aceb612894c87de9618e215d19b73cddf14e0e0913ed
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.13 | 218 KB | sha256:e3c004ab8d2830b9500a5bdee099a714c7a3f1fd99061821625b6a2e754bf794
unit-tests-log-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.13 | 621 KB | sha256:03e2263344f5c421b01dcd7c5778e41ed26d2c6d918a584b7f2b059717b718f1