
[BUG] Empty string when calling cat.llm() with Ollama and Llama3.3 #999

Open
danieledav opened this issue Dec 29, 2024 · 3 comments
Labels: bug (Something isn't working)

Comments

@danieledav

Describe the bug
I get an empty string when calling cat.llm() with Ollama and Llama3.3 as the LLM configuration.
Additional info:

  • The behavior is normal with Ollama and Llama3
  • cat.llm() is called in a custom plugin that uses the "before_rabbithole_stores_documents" hook

To Reproduce
Steps to reproduce the behavior:

  1. Create a custom plugin that calls cat.llm() inside any of the Rabbit Hole hooks (a minimal sketch follows below)
  2. Try to ingest a document
  3. In the console, see that the result of calling cat.llm() (with any prompt) is an empty string
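
A minimal plugin sketch that reproduces this (the import is the standard plugin decorator import; the prompt text and the print() call are illustrative, not taken from the original plugin):

from cat.mad_hatter.decorators import hook

@hook
def before_rabbithole_stores_documents(docs, cat):
    # Any prompt will do; with Llama3.3 the returned string is empty,
    # while with Llama3 a normal reply comes back.
    result = cat.llm("Reply with the single word: OK")
    print(f"cat.llm() returned: {result!r}")
    return docs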

Expected behavior
cat.llm() should return a non-empty string containing the LLM's response to the prompt.

Additional context
Here is the code snippet where cat.llm() is called:

from cat.mad_hatter.decorators import hook

@hook  # default priority = 1
def before_rabbithole_stores_documents(docs, cat):
    # summarize groups of 5 documents and add them alongside the original ones
    entire_doc = "\n".join([doc.page_content for doc in docs])
    # Italian prompt: "Who is the author or the organization that issued the following
    # document? Answer concisely; if you cannot tell, do not answer anything.
    # Do not include dates, only the name of the person or organization."
    author = cat.llm(f"""Chi è l'autore o l'organizzazione che ha emesso il seguente documento?
                     Rispondi in maniera concisa, se non riesci a capirlo non rispondere nulla.
                     Non includere date ma solo il nome della persona o dell'organizzazione: {entire_doc}
                     """)
    # Italian prompt: "Write a short, concise summary of the following text."
    summary = cat.llm(f"Fai un riassunto breve e conciso del seguente testo: {entire_doc}")

Here is the console output in debug mode with Ollama and Llama3.3:

 ==================== before_rabbithole_stores_documents prompt ====================
cheshire_cat_core  | SystemMessage
cheshire_cat_core  | Chi è l'autore o l'organizzazione che ha emesso il seguente documento?
cheshire_cat_core  |                      Rispondi in maniera concisa, se non riesci a capirlo non rispondere nulla.
cheshire_cat_core  |                      Non includere date ma solo il nome della persona o dell'organizzazione: Come puoi

[...]

========================================
cheshire_cat_core  | 
cheshire_cat_core  | 
cheshire_cat_core  | ==================== before_rabbithole_stores_documents prompt output ====================
cheshire_cat_core  | 
cheshire_cat_core  | ========================================

Here is the console output in debug mode with Ollama and Llama3:

==================== before_rabbithole_stores_documents prompt ====================
cheshire_cat_core  | SystemMessage
cheshire_cat_core  | Chi è l'autore o l'organizzazione che ha emesso il seguente documento?
cheshire_cat_core  |                      Rispondi in maniera concisa, se non riesci a capirlo non rispondere nulla.
cheshire_cat_core  |                      Non includere date ma solo il nome della persona o dell'organizzazione: Come puoi

[...]

========================================
cheshire_cat_core  | 
cheshire_cat_core  | 
cheshire_cat_core  | ==================== before_rabbithole_stores_documents prompt output ====================
cheshire_cat_core  | L'autore del documento è [...]
cheshire_cat_core  | ========================================
@danieledav added the bug (Something isn't working) label on Dec 29, 2024
@danieledav (Author)

Additional context
Chatting through the WebSocket works fine with Llama3.3.
The bug shows up only when calling cat.llm().

@danieledav (Author)

Below is the solution I propose. It works but needs wider testing: in the llm() method of StrayCat, replace SystemMessage with HumanMessage (ref. line 296 of core/cat/looking_glass/stray_cat.py).

Here's the code:

from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    messages=[
        HumanMessage(content=prompt)
    ]
)
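
As a quick standalone check (a sketch that assumes the langchain-ollama package and a local Ollama server with the llama3.3 model pulled; nothing here is Cheshire Cat code), the difference between a system-only and a human-only conversation can be observed directly:

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.3")

# System-only conversation, as StrayCat.llm() currently builds it.
print(repr(llm.invoke([SystemMessage(content="Say hello.")]).content))

# Human message, as in the proposed fix.
print(repr(llm.invoke([HumanMessage(content="Say hello.")]).content))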

@pieroit (Member)

pieroit commented Jan 3, 2025

@danieledav thanks, can you check whether the same bug happens on the develop branch?
We made some changes to how we use LangChain.
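
For reference, a hypothetical way to try this against the develop branch (assuming a Docker-based local setup; adjust the steps to however you normally run the Cat):

git clone https://github.com/cheshire-cat-ai/core.git
cd core
git checkout develop
docker compose up --build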
