Failed to connect to context #458

Open
jhao104 opened this issue Sep 29, 2024 · 0 comments

Livy server log:

24/09/29 18:17:21 WARN [ContextLauncher-1] ContextLauncher: Child process exited with code 1.
24/09/29 18:17:21 ERROR [RPC-Handler-6] RSCClient: Failed to connect to context.
java.io.IOException: Child process exited with code 1.
        at org.apache.livy.rsc.ContextLauncher$ChildProcess$1.run(ContextLauncher.java:397)
        at org.apache.livy.rsc.ContextLauncher$ChildProcess$2.run(ContextLauncher.java:448)
        at java.lang.Thread.run(Thread.java:750)
24/09/29 18:17:21 INFO [RPC-Handler-6] RSCClient: Failing pending job d501f792-6a0c-47e2-aff6-d6ba04d50f97 due to shutdown.
24/09/29 18:17:21 INFO [scala-execution-context-global-63] InteractiveSession: Stopping InteractiveSession 11787...
24/09/29 18:17:21 INFO [RPC-Handler-6] InteractiveSession: Failed to ping RSC driver for session 11787. Killing application.
24/09/29 18:17:22 INFO [RPC-Handler-6] RSCClient: Job d501f792-6a0c-47e2-aff6-d6ba04d50f97 already failed.
java.lang.IllegalStateException: spark-submit start failed
        at org.apache.livy.utils.SparkYarnApp.getAppIdFromTag(SparkYarnApp.scala:196)
        at org.apache.livy.utils.SparkYarnApp.$anonfun$yarnAppMonitorThread$3(SparkYarnApp.scala:271)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.livy.utils.SparkYarnApp.$anonfun$yarnAppMonitorThread$1(SparkYarnApp.scala:268)
        at org.apache.livy.Utils$$anon$1.run(Utils.scala:97)
24/09/29 18:17:22 ERROR [yarnAppMonitorThread-org.apache.livy.utils.SparkYarnApp@7433857e] SparkYarnApp: Error whiling refreshing YARN state
java.lang.IllegalStateException: spark-submit start failed
        at org.apache.livy.utils.SparkYarnApp.getAppIdFromTag(SparkYarnApp.scala:196)
        at org.apache.livy.utils.SparkYarnApp.$anonfun$yarnAppMonitorThread$3(SparkYarnApp.scala:271)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.livy.utils.SparkYarnApp.$anonfun$yarnAppMonitorThread$1(SparkYarnApp.scala:268)
        at org.apache.livy.Utils$$anon$1.run(Utils.scala:97)
24/09/29 18:17:22 WARN [scala-execution-context-global-71] InteractiveSession: Fail to get rsc uri
java.util.concurrent.ExecutionException: java.io.IOException: Child process exited with code 1.
        at io.netty.util.concurrent.DefaultPromise.get(DefaultPromise.java:351)
        at org.apache.livy.server.interactive.InteractiveSession.$anonfun$start$5(InteractiveSession.scala:474)
        at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
        at scala.util.Success.$anonfun$map$1(Try.scala:255)
        at scala.util.Success.map(Try.scala:213)
        at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
        at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
        at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
        at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.io.IOException: Child process exited with code 1.
        at org.apache.livy.rsc.ContextLauncher$ChildProcess$1.run(ContextLauncher.java:397)
        at org.apache.livy.rsc.ContextLauncher$ChildProcess$2.run(ContextLauncher.java:448)
        at java.lang.Thread.run(Thread.java:750)

Session log (spark-submit output):

24/09/29 18:17:21 INFO Client: Deleted staging directory hdfs://**:9000/user/livy/.sparkStaging/application_1727350308711_11779
Exception in thread "main" java.lang.IllegalArgumentException: Attempt to add (file:///etc/taihao-apps/spark-conf/hive-site.xml) multiple times to the distributed cache.
	at org.apache.spark.deploy.yarn.Client.$anonfun$prepareLocalResources$26(Client.scala:717)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.deploy.yarn.Client.$anonfun$prepareLocalResources$25(Client.scala:707)
	at org.apache.spark.deploy.yarn.Client.$anonfun$prepareLocalResources$25$adapted(Client.scala:706)
	at scala.collection.immutable.List.foreach(List.scala:431)
	at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:706)
	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:983)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:220)
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1310)
	at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1758)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:176)
	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:174)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
24/09/29 18:17:21 INFO ShutdownHookManager: Shutdown hook called

Livy: 0.8.0, Spark: 3.4.2
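
For context, the `IllegalArgumentException` in the session log is thrown from Spark's YARN `Client.prepareLocalResources` (visible in the stack trace) when the same local resource appears more than once among the files to be shipped to the YARN distributed cache. Here that file is `hive-site.xml`, likely because it is picked up both from the Spark conf dir handling and from an explicit `spark.files` / `spark.yarn.dist.files` entry; that part is an assumption, not confirmed by the logs. Below is a rough, self-contained sketch of the duplicate check, not the actual Spark code:

```scala
import java.net.URI
import scala.collection.mutable

// Rough sketch (assumption, not Spark's real implementation) of the duplicate
// check the YARN client performs while preparing local resources: every file
// shipped to the distributed cache is remembered by URI, and a plain file may
// only be distributed once.
object DistributedCacheSketch {
  private val distributedUris = mutable.Set.empty[String]

  /** Returns true if the URI was newly registered, false if it was seen before. */
  private def addDistributedUri(uri: URI): Boolean = distributedUris.add(uri.toString)

  /** Register a file (e.g. from spark.yarn.dist.files); a duplicate is fatal. */
  def distributeFile(path: String): Unit = {
    if (!addDistributedUri(new URI(path))) {
      throw new IllegalArgumentException(
        s"Attempt to add ($path) multiple times to the distributed cache.")
    }
  }

  def main(args: Array[String]): Unit = {
    distributeFile("file:///etc/taihao-apps/spark-conf/hive-site.xml")
    // Registering the same URI a second time reproduces the error in the session log.
    distributeFile("file:///etc/taihao-apps/spark-conf/hive-site.xml")
  }
}
```

If that is indeed what is happening here, removing the duplicate `hive-site.xml` entry from `spark-defaults.conf` (or from whatever Livy adds to `spark.files`) should let spark-submit start and the session connect again; the Livy-side "Child process exited with code 1" / "Failed to connect to context" errors are just the server reacting to the failed submission.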
