[Bug] GraphRAG Backend application is crashing #1276
Comments
Thanks for reporting, let me try to reproduce and analyze the issue.
We cannot reproduce this with the latest code. The version you used was at a specific point: it changes the gateway and transfers code between GenAIComps and GenAIExamples (some code moved from GenAIComps to GenAIExamples), so the two repos must match. Would you please update your code to the latest (both GenAIExamples and GenAIComps) and try again?
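A quick way to confirm the mismatch on the affected machine is to check whether the installed GenAIComps checkout still exposes `handle_message` under the module path shown in the traceback. This is a minimal diagnostic sketch, not part of either repo; it only assumes the module path and symbol name quoted in this issue.

```python
# Diagnostic sketch: check whether the local GenAIComps checkout still
# provides `handle_message` in comps.cores.mega.utils (the path from the
# traceback in this issue). A failure suggests the GenAIComps revision does
# not match the GenAIExamples revision being deployed.
import importlib


def has_handle_message() -> bool:
    try:
        module = importlib.import_module("comps.cores.mega.utils")
    except ImportError as exc:
        print(f"comps.cores.mega.utils is not importable: {exc}")
        return False
    if hasattr(module, "handle_message"):
        print("handle_message is available; the repos likely match.")
        return True
    print("handle_message is missing; update GenAIComps and GenAIExamples "
          "to matching revisions.")
    return False


if __name__ == "__main__":
    has_handle_message()
```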
@artem-astafev
Thank you for the reply; I will check and let you know.
Thank you for the reminder. We are using the pre-built TGI image for ROCm (ghcr.io/huggingface/text-generation-inference:2.3.1-rocm) for the TGI instances on AMD GPUs.
Priority
P3-Medium
OS type
Ubuntu
Hardware type
GPU-AMD
Installation method
Deploy method
Running nodes
Single Node
What's the version?
e18369b, PR https://github.com/opea-project/GenAIExamples/pull/1250/files
Description
The GraphRAG backend application (MegaService) is crashing with the following ImportError:
cannot import name 'handle_message' from 'comps.cores.mega.utils' (/home/user/GenAIComps/comps/cores/mega/utils.py)
Reproduce steps
Raw log