Pushing segment files generated through a Spark cluster-mode or standalone job fails with "Could not find paramName tableName in path or query params of the API v2/segments" when S3 is the deep store
#14407 · Open
chrajeshbabu opened this issue on Nov 7, 2024 · 0 comments
When S3 is enabled as the deep store and data is loaded through a Spark or standalone ingestion job, the following error occurs:
2024/11/07 10:43:41.262 INFO [HttpClient] [Executor task launch worker for task 0.0 in stage 1.0 (TID 31)] Sending request: http://host1:8556/v2/segments to controller: host1.visa.com, version: Unknown
2024/11/07 10:43:41.263 WARN [SegmentPushUtils] [Executor task launch worker for task 0.0 in stage 1.0 (TID 31)] Caught temporary exception while pushing table: airlineStats segment uri: s3a://projects/data/pinot/examples/output/airlineStats/segments/2014/01/01/airlineStats_batch_2014-01-01_2014-01-01.tar.gz to http://host1:8556, will retry
org.apache.pinot.common.exception.HttpErrorStatusException: Got error status code: 500 (Internal Server Error) with reason: "Could not find paramName tableName in path or query params of the API: http://host1:8556/v2/segments" while sending request: http://host1:8556/v2/segments to controller: host1.visa.com, version: Unknown
at org.apache.pinot.common.utils.http.HttpClient.wrapAndThrowHttpException(HttpClient.java:448) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.common.utils.FileUploadDownloadClient.sendSegmentUri(FileUploadDownloadClient.java:982) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.segment.local.utils.SegmentPushUtils.lambda$sendSegmentUris$1(SegmentPushUtils.java:234) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.spi.utils.retry.BaseRetryPolicy.attempt(BaseRetryPolicy.java:50) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.segment.local.utils.SegmentPushUtils.sendSegmentUris(SegmentPushUtils.java:231) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.segment.local.utils.SegmentPushUtils.sendSegmentUris(SegmentPushUtils.java:115) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.plugin.ingestion.batch.spark3.SparkSegmentUriPushJobRunner$1.call(SparkSegmentUriPushJobRunner.java:128) ~[pinot-batch-ingestion-spark-3-1.2.0-shaded.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.pinot.plugin.ingestion.batch.spark3.SparkSegmentUriPushJobRunner$1.call(SparkSegmentUriPushJobRunner.java:118) ~[pinot-batch-ingestion-spark-3-1.2.0-shaded.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreach$1(JavaRDDLike.scala:352) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreach$1$adapted(JavaRDDLike.scala:352) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.17.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.17.jar:?]
at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.rdd.RDD.$anonfun$foreach$2(RDD.scala:1002) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.rdd.RDD.$anonfun$foreach$2$adapted(RDD.scala:1002) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.scheduler.Task.run(Task.scala:139) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$runWithUgi$3(Executor.scala:589) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1540) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.executor.Executor$TaskRunner.runWithUgi(Executor.scala:592) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:518) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
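For anyone narrowing this down: the request in the log above goes to /v2/segments with no tableName anywhere on the URL, and the 500 complains about exactly that missing parameter. Below is a minimal sketch using the plain JDK HTTP client that probes the same endpoint with and without tableName as a query parameter. The host, table name, and segment URI are copied from the log, and the DOWNLOAD_URI / UPLOAD_TYPE headers are an assumption about what the URI push path sends, so treat this as a diagnostic aid rather than an exact reproduction of the Pinot client.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SegmentUriPushProbe {
  public static void main(String[] args) throws Exception {
    // Values taken from the log above; adjust for your environment.
    String controller = "http://host1:8556";
    String tableName = "airlineStats";
    String segmentUri = "s3a://projects/data/pinot/examples/output/airlineStats/segments/"
        + "2014/01/01/airlineStats_batch_2014-01-01_2014-01-01.tar.gz";

    HttpClient client = HttpClient.newHttpClient();

    // Shape of the failing request in the log: POST /v2/segments with no tableName on the URL.
    // The DOWNLOAD_URI / UPLOAD_TYPE headers are an assumption about what the URI push sends.
    HttpRequest withoutTableName = HttpRequest.newBuilder()
        .uri(URI.create(controller + "/v2/segments"))
        .header("DOWNLOAD_URI", segmentUri)
        .header("UPLOAD_TYPE", "URI")
        .POST(HttpRequest.BodyPublishers.noBody())
        .build();

    // Same request with tableName passed as a query parameter, which is what the
    // "Could not find paramName tableName in path or query params" message asks for.
    HttpRequest withTableName = HttpRequest.newBuilder()
        .uri(URI.create(controller + "/v2/segments?tableName=" + tableName))
        .header("DOWNLOAD_URI", segmentUri)
        .header("UPLOAD_TYPE", "URI")
        .POST(HttpRequest.BodyPublishers.noBody())
        .build();

    for (HttpRequest request : new HttpRequest[] {withoutTableName, withTableName}) {
      HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
      System.out.println(request.uri() + " -> " + response.statusCode());
    }
  }
}

If the second request gets past the parameter check while the first one returns the same 500, that would confirm the URI push from the Spark executors is simply not carrying the tableName the controller expects.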