
Not able to build opea/llm-tgi:latest while running the Translation example #1344

Open
3 of 6 tasks
ShankarRIntel opened this issue Jan 4, 2025 · 3 comments
Assignees
Labels
aitce bug Something isn't working

Comments

@ShankarRIntel

Priority

P1-Stopper

OS type

Ubuntu

Hardware type

Gaudi2

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source

Deploy method

  • Docker compose
  • Docker
  • Kubernetes
  • Helm

Running nodes

Single Node

What's the version?

v1.1

Description

Unable to build the Docker image because the Dockerfile is not present at the expected path.

Reproduce steps

sratnesh@fm2r81s1gaudi:~/workspace/GenAIComps$ docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
[+] Building 0.0s (0/0) docker:default
ERROR: resolve : lstat comps/llms/src: no such file or directory

Raw log

sratnesh@fm2r81s1gaudi:~/workspace/GenAIComps$ docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
[+] Building 0.0s (0/0)                                                                                                                                                                                   docker:default
ERROR: resolve : lstat comps/llms/src: no such file or directory
@ShankarRIntel ShankarRIntel added the bug Something isn't working label Jan 4, 2025
@ShankarRIntel ShankarRIntel changed the title Not able to build opea/llm-tgi:latest Not able to build opea/llm-tgi:latest while running the Translation example Jan 4, 2025
@xiguiw
Collaborator

xiguiw commented Jan 6, 2025

Raw log

sratnesh@fm2r81s1gaudi:~/workspace/GenAIComps$ docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
[+] Building 0.0s (0/0)                                                                                                                                                                                   docker:default
ERROR: resolve : lstat comps/llms/src: no such file or directory

@ShankarRIntel

The Dockerfile is here:
https://github.com/opea-project/GenAIComps/blob/6419ace56c8ca0c8f3ed8ea66d1ee7481bc85dc0/comps/llms/src/text-generation/Dockerfile#L6

The code structure was updated recently.
Please update your repo and try again.
Please update both GenAIExamples and GenAIComps.
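The update-and-rebuild flow above can be sketched as follows. This is a minimal sketch: the helper name `build_llm_tgi` is mine, not part of GenAIComps, and the paths assume the `~/workspace` layout from the issue. The path check simply fails fast with an actionable message when the checkout predates the layout change:

```shell
# Refresh both checkouts so the new comps/llms/src layout is present:
#   cd ~/workspace/GenAIComps   && git pull
#   cd ~/workspace/GenAIExamples && git pull

# Hypothetical helper: verify the Dockerfile exists in the checkout
# before invoking docker build, so a stale repo fails with a clear
# message instead of "lstat comps/llms/src: no such file or directory".
build_llm_tgi() {
  local repo_root="$1"
  local dockerfile="comps/llms/src/text-generation/Dockerfile"
  if [ ! -f "$repo_root/$dockerfile" ]; then
    echo "missing $repo_root/$dockerfile -- run 'git pull' first" >&2
    return 1
  fi
  # Build from the repo root so the context contains comps/llms/src.
  ( cd "$repo_root" && docker build -t opea/llm-tgi:latest \
      --build-arg https_proxy="$https_proxy" \
      --build-arg http_proxy="$http_proxy" \
      -f "$dockerfile" . )
}

# Usage: build_llm_tgi ~/workspace/GenAIComps
```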

@ShankarRIntel
Author

translation_errors.log
I was able to bring up the servers, but I am not able to connect to the UI. The errors are attached in translation_errors.log.

@xiguiw
Collaborator

xiguiw commented Jan 13, 2025

@ShankarRIntel

The following log indicates that llm_endpoint is not set.

Let me check the code refactor and documents alignment.

llm-tgi-gaudi-server              |     self.client = self._initialize_client()
llm-tgi-gaudi-server              |                   ^^^^^^^^^^^^^^^^^^^^^^^^^
llm-tgi-gaudi-server              |   File "/home/user/comps/llms/src/text-generation/integrations/opea.py", line 74, in _initialize_client
llm-tgi-gaudi-server              |     return AsyncOpenAI(api_key=OPENAI_API_KEY, base_url=llm_endpoint + "/v1", timeout=600, default_headers=headers)
llm-tgi-gaudi-server              |                                                         ~~~~~~~~~~~~~^~~~~~~
llm-tgi-gaudi-server              | TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
llm-tgi-gaudi-server              | Traceback (most recent call last):
tgi-gaudi-server                  |     shard_uds_path: "/tmp/text-generation-server",

The logs are from the latest code.

@ShankarRIntel

Did you "Pull docker images from hub.docker.com" or "Build docker images from source"?
Please describe how you set up the environment, or the document you followed to set it up.
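The TypeError in the log comes from concatenating None with "/v1": the endpoint variable was never populated, so the string concatenation crashes at client initialization. A minimal sketch of a guard that fails with a clearer message instead (this is not the OPEA code; the function name and the `LLM_ENDPOINT` variable name are my assumptions for illustration):

```python
import os

def make_base_url(llm_endpoint):
    """Build the OpenAI-compatible base URL from the TGI endpoint.

    Mirrors the failing expression llm_endpoint + "/v1" from the log:
    when llm_endpoint is None, that expression raises
    TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'.
    """
    if llm_endpoint is None:
        # Fail fast with an actionable message instead of a TypeError.
        raise ValueError(
            "LLM endpoint is not set; export the endpoint environment "
            "variable before starting the service"
        )
    # Normalize a trailing slash so the result is always host + "/v1".
    return llm_endpoint.rstrip("/") + "/v1"

# Typical use: read the endpoint from the environment at startup
# (variable name and default port are assumptions, not from the repo).
base_url = make_base_url(os.environ.get("LLM_ENDPOINT", "http://localhost:8008"))
```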
