
[Bug] GraphRAG Backend application is crashing #1276

Open
3 of 6 tasks
artem-astafev opened this issue Dec 19, 2024 · 5 comments
Labels
bug Something isn't working


@artem-astafev
Contributor

Priority

P3-Medium

OS type

Ubuntu

Hardware type

GPU-AMD

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source

Deploy method

  • Docker compose
  • Docker
  • Kubernetes
  • Helm

Running nodes

Single Node

What's the version?

e18369b, PR https://github.com/opea-project/GenAIExamples/pull/1250/files

Description

The GraphRAG backend application (MegaService) is crashing with an ImportError: cannot import name 'handle_message' from 'comps.cores.mega.utils' (/home/user/GenAIComps/comps/cores/mega/utils.py).
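When an ImportError like this appears, a quick way to confirm whether the checked-out GenAIComps code actually exports the symbol is to probe the module directly. A minimal sketch (only the `comps.cores.mega.utils` path and `handle_message` name come from the traceback; the helper itself is illustrative):

```python
import importlib


def module_exports(module_name: str, symbol: str) -> bool:
    """Return True if `symbol` can be imported from `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # Module itself is missing or fails to import.
        return False
    return hasattr(module, symbol)


# Example: check the helper the traceback complains about.
# module_exports("comps.cores.mega.utils", "handle_message")
```

If this returns False inside the container, the mounted GenAIComps checkout simply does not define the symbol at that version, which points to a repo-version mismatch rather than a packaging problem.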

Reproduce steps

  1. Fetch the GenAIExamples repo: git clone https://github.com/opea-project/GenAIExamples.git
  2. Go to the build directory: cd ./GenAIExamples/GraphRAG/docker_image_build/
  3. Fetch the GenAIComps repo: git clone https://github.com/opea-project/GenAIComps.git
  4. Build the images with docker compose: docker compose -f build.yaml build
  5. Run GraphRAG with docker compose

Raw log

graphrag-backend-server  | Traceback (most recent call last):
graphrag-backend-server  |   File "/home/user/graphrag.py", line 10, in <module>
graphrag-backend-server  |     from comps.cores.mega.utils import handle_message
graphrag-backend-server  | ImportError: cannot import name 'handle_message' from 'comps.cores.mega.utils' (/home/user/GenAIComps/comps/cores/mega/utils.py)
graphrag-backend-server  | (the same traceback repeats identically nine more times in the log)
@ZailiWang
Collaborator

Thanks for reporting, let me try to reproduce and analyze the issue.

@xiguiw
Collaborator

xiguiw commented Dec 20, 2024

@artem-astafev

We cannot reproduce this with the latest code.

The version you used was at a specific transition point: it changed the gateway and moved some code from GenAIComps to GenAIExamples, so the two repos must be at matching versions.

Your version:
e18369b, PR https://github.com/opea-project/GenAIExamples/pull/1250/files

Would you please update your code to the latest (both GenAIExamples and GenAIComps) and try again?

@lkk12014402
Collaborator

lkk12014402 commented Dec 20, 2024

@artem-astafev
Kindly remind: the tgi (text-generation-inference) image is for Intel HPU (Gaudi): https://github.com/opea-project/GenAIExamples/blob/main/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml#L41C1-L43C37. If you use other hardware, you should change the TGI docker image.
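The advice above amounts to a small hardware-to-image lookup before composing the stack. In this sketch, the ROCm tag is the one Artem quotes later in the thread; the Gaudi entry is a placeholder for whatever the linked compose.yaml pins, not a verified image name:

```python
# Map hardware targets to TGI (text-generation-inference) images.
# The ROCm tag is quoted from this thread; the Gaudi entry is a
# placeholder, not a verified tag.
TGI_IMAGES = {
    "rocm": "ghcr.io/huggingface/text-generation-inference:2.3.1-rocm",
    "gaudi": "<gaudi-specific image pinned in compose.yaml>",
}


def tgi_image(hardware: str) -> str:
    """Return the TGI image for a hardware target, failing loudly otherwise."""
    image = TGI_IMAGES.get(hardware)
    if image is None:
        raise ValueError(f"no TGI image configured for {hardware!r}")
    return image
```

Failing loudly on an unknown target is the point: silently falling back to the Gaudi image on AMD hardware is exactly the misconfiguration being warned about.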

@artem-astafev
Contributor Author

@artem-astafev

We cannot reproduce this with the latest code.

The version you used was at a specific transition point: it changed the gateway and moved some code from GenAIComps to GenAIExamples, so the two repos must be at matching versions.

Your version: e18369b, PR https://github.com/opea-project/GenAIExamples/pull/1250/files

Would you please update your code to the latest (both GenAIExamples and GenAIComps) and try again?

Thank you for the reply, I will check and let you know.

@artem-astafev
Contributor Author

@artem-astafev Kindly remind: the tgi (text-generation-inference) image is for Intel HPU (Gaudi): https://github.com/opea-project/GenAIExamples/blob/main/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml#L41C1-L43C37. If you use other hardware, you should change the TGI docker image.

Thank you for the reminder; we are using the pre-built TGI image for ROCm, ghcr.io/huggingface/text-generation-inference:2.3.1-rocm, for the TGI instances on AMD GPUs.
