meilisearch tracing_actix_web::middleware error_code: "document_not_found" #5051
-
What happened?
Attached a PDF to a thread; RAG is successful, but meilisearch is not able to find it. This has happened on multiple deployments, and so far I have not been able to fix local RAG without using "forcePrompt: true". All models tested have Tools.

Steps to Reproduce
1 - Attach a file to a Thread

What browsers are you seeing the problem on?
No response

Relevant log output
LibreChat | {"file_id":"dcdcb57f-d039-4295-bbe2-c5e9e909b660","filename":"Products.pdf","known_type":true,"level":"debug","message":"File processed successfully.","status":true,"timestamp":"2024-12-19T18:31:09.908Z"}
LibreChat | {"artifactsPrompt":"The assistant can create and reference artifacts during conversations.\n\nArtifacts are for substantial, self-contained content that users might modify or reuse, displayed in a separate UI window for clarity.\n\n# Good artifacts are...\n- Substantial content (...","attachments":{},"conversationId":"d2687570-f261-4380-af62-7956d6ce4120","endpoint":"Secure AI","endpointType":"custom","level":"debug","message":"[AskController]","modelOptions":{"model":"llama3.3:latest"},"modelsConfig":"exists","resendFiles":true,"text":"Summarize document","timestamp":"2024-12-19T18:32:15.016Z"}
LibreChat | {"conversationId":"d2687570-f261-4380-af62-7956d6ce4120","level":"debug","message":"[BaseClient] Loading history:","parentMessageId":"77005ca0-869b-43f3-b450-4837eeeed9c6","timestamp":"2024-12-19T18:32:15.131Z"}
LibreChat | {"level":"debug","message":"[BaseClient] instructions tokenCount: 6099","timestamp":"2024-12-19T18:32:15.267Z"}
LibreChat | {"level":"debug","maxContextTokens":127500,"message":"[BaseClient] Context Count (1/2)","remainingContextTokens":121390,"timestamp":"2024-12-19T18:32:15.267Z"}
LibreChat | {"level":"debug","maxContextTokens":127500,"message":"[BaseClient] Context Count (2/2)","remainingContextTokens":121390,"timestamp":"2024-12-19T18:32:15.267Z"}
LibreChat | {"55f693c3-16f9-4f02-bf8c-de5f2ef1a70b":8,"level":"debug","message":"[BaseClient] tokenCountMap:","timestamp":"2024-12-19T18:32:15.268Z"}
LibreChat | {"level":"debug","maxContextTokens":127500,"message":"[BaseClient]","payloadSize":2,"promptTokens":6110,"remainingContextTokens":121390,"timestamp":"2024-12-19T18:32:15.268Z"}
LibreChat | {"55f693c3-16f9-4f02-bf8c-de5f2ef1a70b":8,"instructions":6099,"level":"debug","message":"[BaseClient] tokenCountMap","timestamp":"2024-12-19T18:32:15.268Z"}
LibreChat | {"conversationId":"d2687570-f261-4380-af62-7956d6ce4120","isCreatedByUser":true,"level":"debug","message":"[BaseClient] userMessage","messageId":"55f693c3-16f9-4f02-bf8c-de5f2ef1a70b","parentMessageId":"77005ca0-869b-43f3-b450-4837eeeed9c6","sender":"User","text":"Summarize document","timestamp":"2024-12-19T18:32:15.269Z","tokenCount":8}
LibreChat | {"baseURL":"http://10.64.73.1:11434/v1","level":"debug","message":"[OpenAIClient] chatCompletion","modelOptions":{"messages":[{"content":"Instructions:\nThe user has attached a file to the conversation:\n \n <file>\n <filename>ASI AI Products.pdf</filename>\n <type>application/pdf</type>\n </file>\n\n A semantic search was execut...","role":"system"},{"content":"Summarize document","role":"user"}],"model":"llama3.3:latest","stream":true,"user":"67634c39cb816208ee73c74c"},"timestamp":"2024-12-19T18:32:15.271Z"}
LibreChat | {"level":"debug","message":"Making request to http://10.64.73.1:11434/v1/chat/completions","timestamp":"2024-12-19T18:32:15.276Z"}
LibreChat | {"level":"debug","message":"[saveConvo] api/app/clients/BaseClient.js - saveMessageToDatabase #saveConvo","timestamp":"2024-12-19T18:32:15.281Z"}
chat-meilisearch | 2024-12-19T18:32:15.289031Z WARN HTTP request{method=GET host="meilisearch:7700" route=/indexes/convos/documents/d2687570-f261-4380-af62-7956d6ce4120 query_parameters= user_agent=node status_code=404 error=Document `d2687570-f261-4380-af62-7956d6ce4120` not found.}: tracing_actix_web::middleware: Error encountered while processing the incoming HTTP request: ResponseError { code: 404, message: "Document `d2687570-f261-4380-af62-7956d6ce4120` not found.", error_code: "document_not_found", error_type: "invalid_request", error_link: "https://docs.meilisearch.com/errors#document_not_found" }
chat-meilisearch | 2024-12-19T18:32:15.289134Z INFO HTTP request{method=GET host="meilisearch:7700" route=/indexes/convos/documents/d2687570-f261-4380-af62-7956d6ce4120 query_parameters= user_agent=node status_code=404 error=Document `d2687570-f261-4380-af62-7956d6ce4120` not found.}: meilisearch: close time.busy=289µs time.idle=267µs
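For reference, the "forcePrompt: true" workaround mentioned above is set per custom endpoint in librechat.yaml. A minimal sketch, assuming an endpoint named like the "Secure AI" one in the logs (the name, key, and model are illustrative, not the reporter's actual config):

```yaml
# Hypothetical librechat.yaml excerpt; values are placeholders.
endpoints:
  custom:
    - name: "Secure AI"
      apiKey: "ollama"                       # Ollama ignores the key, but one is required
      baseURL: "http://10.64.73.1:11434/v1"  # baseURL seen in the logs above
      models:
        default: ["llama3.3:latest"]
      forcePrompt: true  # send a single `prompt` payload instead of a `messages` array
```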
-
There is no relevant error present; meilisearch is not used for RAG. It seems the LLM is ignoring the initial message. I can see the RAG response is successful here:
I noticed the
-
@danny-avila I am still facing the issue with Ollama not receiving the embeddings with local RAG. Please review it again, as I am quite frustrated now, and I'm a paying customer, albeit for code execution. See another example below where the embedding worked fine, but Ollama is not receiving the data. I've tried many different models. I very much appreciate everything you are doing and your support! Please let me know should you need any additional information.
-
I’ve finally resolved my issue with running local RAG alongside an external Ollama in a Docker container on another host. I thought I’d share my configuration with anyone facing similar problems. The scenario: LibreChat with RAG runs in containers on a CPU-powered host, while Ollama runs in a container on a separate GPU-powered server. On the Ollama server, here is my final docker-compose config:
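(The file contents were not captured in this copy of the thread. As a placeholder, here is a minimal sketch of a typical docker-compose.yml for Ollama on a GPU host; the image, port, volume, and GPU reservation follow the standard Ollama Docker setup and are assumptions, not the author's actual file:)

```yaml
# Hypothetical Ollama docker-compose.yml for the GPU server; not the author's original.
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"          # expose the API so other hosts (e.g. 10.64.73.1:11434) can reach it
    volumes:
      - ollama:/root/.ollama   # persist downloaded models across restarts
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
volumes:
  ollama:
```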
librechat.yaml
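(Contents not captured in this copy. A hedged sketch of what the custom endpoint section might look like, pointing LibreChat at the external Ollama host; the endpoint name and model list are illustrative, and the baseURL matches the one seen in the logs above:)

```yaml
# Hypothetical librechat.yaml excerpt; not the author's original file.
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://10.64.73.1:11434/v1"  # external Ollama host from the logs
      models:
        default: ["llama3.3:latest"]
        fetch: true            # fetch the model list from the Ollama server
      titleConvo: true
      titleModel: "current_model"
```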
docker-compose.override.yml
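(Contents not captured in this copy. A minimal sketch of the standard override that mounts the custom librechat.yaml into the api container, per the usual LibreChat Docker setup; this is an assumption, not the author's actual file:)

```yaml
# Hypothetical docker-compose.override.yml on the LibreChat host; not the author's original.
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```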