Build / Python-only (master, Python 3.13) #278
build_python_3.13.yml
on: schedule
Jobs

- Run / Check changes (36s)
- Run / Protobuf breaking change detection and Python CodeGen check (0s)
- Run / Java 25 build with Maven (0s)
- Run / Run TPC-DS queries with SF=1 (0s)
- Run / Run Docker integration tests (0s)
- Run / Run Spark on Kubernetes Integration test (0s)
- Run / Run Spark UI tests (0s)
- Matrix: Run / build
- Run / Build modules: sparkr (0s)
- Run / Linters, licenses, and dependencies
- Run / Documentation generation (0s)
- Matrix: Run / pyspark
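The "Run /" prefix on every job name indicates that build_python_3.13.yml is a thin scheduled wrapper: its single `Run` job calls a reusable workflow that defines the real build and PySpark test matrix. A minimal sketch of that pattern follows; the cron expression, the reusable workflow path, and the `envs` input are assumptions for illustration, not the actual file contents.

```yaml
# Sketch of a scheduled wrapper workflow, assuming a reusable
# .github/workflows/build_and_test.yml; schedule and input names are illustrative.
name: "Build / Python-only (master, Python 3.13)"

on:
  schedule:
    - cron: "0 9 * * *"    # hypothetical daily trigger; the run page only says "on: schedule"

jobs:
  run:
    name: Run              # this job name is what produces the "Run /" prefix on nested jobs
    uses: ./.github/workflows/build_and_test.yml
    with:
      envs: '{"PYTHON_TO_TEST": "python3.13"}'   # assumed input selecting the Python version under test
```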
Annotations

2 errors and 2 warnings

Errors:
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines: The operation was canceled.
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines: The job has exceeded the maximum execution time of 2h0m0s. (See the timeout sketch after this list.)

Warnings:
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded. (See the upload-artifact sketch after this list.)
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'. (See the setup-miniconda sketch after this list.)
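The two errors belong together: GitHub Actions reports "exceeded the maximum execution time of 2h0m0s" when a job runs past its `timeout-minutes` budget (here matching a value of 120), and it then cancels whatever step was still running, which produces the "operation was canceled" error. A hypothetical job stub showing where that limit lives; the job layout and test command are assumptions, not the actual Spark workflow.

```yaml
# Hypothetical job stub: the 2h0m0s error is GitHub Actions hitting timeout-minutes.
jobs:
  pyspark-ml:
    name: "Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines"
    runs-on: ubuntu-latest
    timeout-minutes: 120   # 2h0m0s; raise this, or split the module list, to avoid the cancellation
    steps:
      - uses: actions/checkout@v4
      - name: Run PySpark ML tests
        # illustrative command; the exact module-selection flags are an assumption
        run: ./dev/run-tests --modules "pyspark-mllib,pyspark-ml,pyspark-ml-connect,pyspark-pipelines"
```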
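The first warning is the standard message actions/upload-artifact emits when its `path` glob matches nothing, here because the pyspark-core job left no `**/target/test-reports/*.xml` files behind. A sketch of such an upload step; the step and artifact names are assumptions, the `if-no-files-found` behavior is the action's documented one.

```yaml
# Step sketch: produces the "No files were found" warning when the glob matches nothing.
- name: Upload test results
  if: always()                          # upload even when the test step failed
  uses: actions/upload-artifact@v4
  with:
    name: test-results-pyspark-core     # assumed name, in the style of the artifact list below
    path: "**/target/test-reports/*.xml"
    if-no-files-found: warn             # default; 'ignore' silences the warning, 'error' fails the step
```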
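The second warning is emitted by conda-incubator/setup-miniconda when the 'defaults' channel would be picked up implicitly. A sketch of a setup step that resolves it, assuming that action and a conda-forge channel; this is not taken from the actual workflow.

```yaml
# Step sketch, assuming conda-incubator/setup-miniconda: either list 'defaults'
# explicitly in 'channels', or remove it with conda-remove-defaults.
- name: Install Python 3.13 with conda
  uses: conda-incubator/setup-miniconda@v3
  with:
    python-version: "3.13"
    channels: conda-forge               # use 'conda-forge,defaults' instead if defaults is intentional
    conda-remove-defaults: "true"       # silences the implicit-defaults warning
```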
Artifacts

Produced during runtime

Name | Size | Digest
---|---|---
apache~spark~TFEOCK.dockerbuild | 27.3 KB | sha256:b175f3547caef8a12d50a4bd3c61db94b1b70f8b4955c53425b7e0b8af3c40ce
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.13 | 203 KB | sha256:9440ac1c9c36a01b87d7a14b7fa3c764d49caf77a5cf81b22b403ae38e69dfea
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.13 | 39.5 KB | sha256:bc991e21cfeb3f2b390c044c4c205351d437728947127e38186ac1e76da709e3
test-results-pyspark-pandas--17-hadoop3-hive2.3-python3.13 | 194 KB | sha256:fb6f1f4de7b9f18fbc4569413fff7971faa64308117f441fae3c410e79089215
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3-python3.13 | 146 KB | sha256:5b76e99c4247083bbe2111edb02ea0acfbbc6f943d6b6c7b5336aa9e706d66b9
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3-python3.13 | 110 KB | sha256:489555c1b5cd57eb105022c3aef7402c0b5abfb79c5402314955b4d349daf350
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3-python3.13 | 81.3 KB | sha256:75de50212017301c062c6ae797b7d24e6f6992d12522754bf8fa215d7e9b1237
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3-python3.13 | 49.4 KB | sha256:1bb20151e6c1293a4de905b61e13f2dc14863ce2de2139bea07cd98d5f5232e4
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.13 | 173 KB | sha256:3d5c72b9fc7c9f63e9cce2a68b4df21f221217fc259b1656a30d0571f1c32908
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.13 | 217 KB | sha256:ce56147840502c643dd8328a74ffc10d2ea958f8d51846ed755b1dcc135dd943
unit-tests-log-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.13 | 374 KB | sha256:5d7a3231d34ca5898f4d5bbf77829c3ec4d7ab3a70e441f3c47db95c649a1bd8