Make the number of fallback storage sub-directories configurable #439
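For context, the change in #439 concerns how many sub-directories Spark's decommission fallback storage spreads its files across. Below is a minimal sketch of how such a knob might be supplied alongside the existing fallback storage settings; the key `spark.storage.decommission.fallbackStorage.numSubDirs`, the bucket path, and the value 64 are hypothetical placeholders for illustration, not necessarily the names or defaults introduced by this change.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: decommissioning with a fallback storage location, plus a
// hypothetical key for the configurable sub-directory count from #439.
val spark = SparkSession.builder()
  .appName("fallback-storage-sketch")
  .config("spark.decommission.enabled", "true")
  .config("spark.storage.decommission.enabled", "true")
  .config("spark.storage.decommission.fallbackStorage.path", "s3a://my-bucket/spark-fallback/")
  // Placeholder config name; the actual key is defined by the change itself.
  .config("spark.storage.decommission.fallbackStorage.numSubDirs", "64")
  .getOrCreate()
```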
build_main.yml
on: push

Job | Duration
---|---
Run / Check changes | 38s
Run / Protobuf breaking change detection and Python CodeGen check | 1m 6s
Run / Run TPC-DS queries with SF=1 | 1h 36m
Run / Run Docker integration tests | 1h 26m
Run / Run Spark on Kubernetes Integration test | 59m 49s
Run / Run Spark UI tests | 24s
Matrix: Run / build |
Run / Build modules: sparkr | 30m 47s
Run / Linters, licenses, and dependencies | 24m 57s
Run / Documentation generation | 47m 1s
Matrix: Run / pyspark |
Annotations
10 errors and 1 warning
Run / Run Spark on Kubernetes Integration test:
- HashSet() did not contain "decomtest-5880689364d9c5bf-exec-1".
- HashSet() did not contain "decomtest-4247ca9364daaf42-exec-1".
- sleep interrupted
- sleep interrupted
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$628/0x00007f419c4fb5e0@392e70ee rejected from java.util.concurrent.ThreadPoolExecutor@21d760dd[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 373]
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$628/0x00007f419c4fb5e0@67588d92 rejected from java.util.concurrent.ThreadPoolExecutor@21d760dd[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 372]
- HashSet() did not contain "decomtest-c547599364ed8c07-exec-1".
- HashSet() did not contain "decomtest-cb25799364ee74e9-exec-1".
- HashSet() did not contain "decomtest-4e04569364f21aac-exec-1".
- Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-9919f03cedfa43609e5b37166b25f205-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-9919f03cedfa43609e5b37166b25f205-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..

Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming:
- No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Artifacts

Produced during runtime

Name | Size
---|---
test-results-docker-integration--17-hadoop3-hive2.3 | 42.2 KB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--17-hadoop3-hive2.3 | 374 KB
test-results-yarn--17-hadoop3-hive2.3 | 42.9 KB