Don't show spark UI and Application ID when using Spark 3.5.0 #2
Hi @duc-dn, which version of the JRE/JDK are you using? Livy Server is currently not compatible with JDK 17, and the Spark 3.5.0 Docker images are built with JDK 17.
Okay, I will test with Spark 3.5.0 and share my findings.
Hi again. Do you have any updates on the above problem?
Also, following up on your point: if Livy is built with Spark 3.2.3, can it run with Spark 3.5.0?
Hi @duc-dn, I am still working on it. The Livy image asifkhatri/livy:spark3.2.3 with the Spark image apache/spark:3.5.0-scala2.12-java11-python3-r-ubuntu might work. We need to build a new Livy image against Spark 3.5.0 for that Spark image.
I have updated the Helm chart to support Spark 3.5.0. You can use this repository for testing. To build your own Docker image, refer to this commit in the incubator-livy repository. Use the following command to build the Livy code:

```
mvn -e -Pthriftserver -Pscala-2.12 -Pspark3 -Phadoop3 -pl -:livy-python-api clean install
```

For the Docker image, use the following Dockerfile:

```dockerfile
FROM apache/spark:3.5.0-scala2.12-java11-python3-r-ubuntu

ENV LIVY_VERSION=0.9.0-incubating-SNAPSHOT
ENV LIVY_PACKAGE=apache-livy-${LIVY_VERSION}_2.12-bin
ENV LIVY_HOME=/opt/livy
ENV LIVY_CONF_DIR=/conf
ENV PATH=$PATH:$LIVY_HOME/bin

USER root

COPY $LIVY_PACKAGE.zip /

# Unpack the Livy binary distribution into /opt and symlink it to LIVY_HOME.
RUN apt-get update && apt-get install -y unzip && \
    unzip /$LIVY_PACKAGE.zip -d / && \
    mv /$LIVY_PACKAGE /opt/ && \
    rm -rf $LIVY_HOME && \
    ln -s /opt/$LIVY_PACKAGE $LIVY_HOME && \
    rm -f /$LIVY_PACKAGE.zip

RUN mkdir /var/log/livy && \
    ln -s /var/log/livy $LIVY_HOME/logs

WORKDIR $LIVY_HOME

ENTRYPOINT ["livy-server"]
```
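For completeness, turning the Maven output and that Dockerfile into an image looks roughly like the following sketch; the zip location under `assembly/target` and the image tag are assumptions, so adjust them to your own build and registry:

```shell
# Copy the binary zip produced by the Maven build next to the Dockerfile
# (the assembly/target path is an assumption; check your actual build output).
cp incubator-livy/assembly/target/apache-livy-0.9.0-incubating-SNAPSHOT_2.12-bin.zip .

# Build and tag the image; replace the tag with your own registry/name.
docker build -t <your-registry>/livy:0.9.0-spark3.5.0 .
```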
Hi @askhatri, I cloned this branch:
Hi @duc-dn, it appears that there are some test failures. I will work on fixing them. In the meantime, you can use the `-DskipITs -DskipTests` flags to skip the tests as a workaround.
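Combined with the build command given earlier in the thread, the workaround is simply the same invocation with the skip flags added (they bypass unit and integration tests, nothing else changes):

```
mvn -e -Pthriftserver -Pscala-2.12 -Pspark3 -Phadoop3 -pl -:livy-python-api -DskipITs -DskipTests clean install
```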
@askhatri, which versions of Maven and Java do you use to build Livy?
I am using Apache Maven 3.3.9 and Java 1.8.0_292, the same configuration that is used at https://github.com/apache/incubator-livy/actions/runs/9871334539/job/27259040026.
Thanks @askhatri, let me check.
@askhatri, I switched to Maven 3.3.9 and Java 1.8.0 but hit the same error as above.
Could this problem be due to the Jackson version? I don't know why you don't run into it.
I'm not sure about the Jackson version. This is new to me.
Hi @askhatri, I found your latest update, thanks.
In the Livy server log I don't see any error raised about LDAP, but the Livy server restarts continuously.
I haven't integrated LDAP with Livy yet, but I'll attempt to do so and validate it. We'll need a running LDAP server and will then configure Livy to use it.
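As a rough sketch, that configuration would live in livy.conf along these lines; the property names and the in-cluster LDAP address below are assumptions based on Livy's LDAP authentication support and should be verified against the deployed Livy version:

```
# livy.conf — hypothetical LDAP auth settings; verify the exact key names
# and the LDAP service address against your Livy version and cluster.
livy.server.auth.type = ldap
livy.server.auth.ldap.url = ldap://<ldap-service>:389
livy.server.auth.ldap.base-dn = dc=example,dc=com
```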
Sorry, I copied it wrongly as localhost. I deployed LDAP in the same namespace as the Livy server.
OK @duc-dn, I will also try the same from my end.
Hi again @askhatri. Do you have any updates about LDAP? |
Hi @duc-dn , I’m still working on the LDAP integration and will update you as soon as I have any progress to share. Currently, the Livy server doesn’t support scaled instances, so we’ll need to implement this high availability (HA) feature for Livy. |
Hi @askhatri, I built a snapshot of the Livy master branch with Spark 3.5.0.
After replacing the images in the Helm chart, it worked well. However, when I check the Livy UI, it doesn't show the Spark UI link or the application ID.
I checked the GET /sessions endpoint, and it doesn't seem to return the Spark session's application info.
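One way to pin down what the UI is (not) receiving is to look at the GET /sessions response directly. A minimal sketch in Python, assuming the standard Livy REST session fields (`id`, `state`, `appId`, `appInfo.sparkUiUrl`); the payload below is fabricated to illustrate the failure mode described here, where `appId` and the Spark UI URL come back as null so the Livy UI has nothing to link to:

```python
import json

# A response in the shape Livy's GET /sessions endpoint returns.
# Field names follow the Livy REST API; the values are made up to
# show the symptom: appId and appInfo.sparkUiUrl are null.
payload = json.loads("""
{
  "from": 0,
  "total": 1,
  "sessions": [
    {
      "id": 0,
      "state": "idle",
      "appId": null,
      "appInfo": {"driverLogUrl": null, "sparkUiUrl": null}
    }
  ]
}
""")

for session in payload["sessions"]:
    app_id = session["appId"]
    spark_ui = session["appInfo"]["sparkUiUrl"]
    # When Livy cannot recover the application metadata from the cluster
    # manager, these fields stay null even though the session itself runs.
    print(f"session {session['id']}: state={session['state']}, "
          f"appId={app_id}, sparkUiUrl={spark_ui}")
```

If the real endpoint shows the same nulls while the driver's own Spark UI is reachable, the session is healthy and only the metadata plumbing is broken.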
If I forward the Spark driver's port, I can see the Spark UI of the session.
In addition, when deploying with your image (built following Docker.md), I still see the Spark UI and application ID.
So is this due to the Livy server (currently) not being compatible with Spark 3.5.0?
Thanks