
[BUG] SOCAI error 400 #745

Open
kazhuyo opened this issue Jul 18, 2024 · 0 comments
Labels
bug Something isn't working

Comments


kazhuyo commented Jul 18, 2024

When trying to process an alert with soc-ai, the docker logs return error 400.

Here's the relevant part of the error in the socai docker container:

request to GPT: status code '400' received '{
  "error": {
    "message": "This model's maximum context length is 16385 tokens. However, your messages resulted in 19181 tokens. Please reduce the length of the messages.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}'

I think the default model for socai is gpt-3.5-turbo-16k, which needs to be changed to gpt-4-turbo to support more than 16K tokens.
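As an alternative (or complement) to switching models, the prompt could be trimmed before sending so it fits the reported 16,385-token limit. The sketch below is only an illustration of that general workaround, not SOCAI's actual code; the 4-characters-per-token ratio is a rough heuristic, and an exact count would require the model's real tokenizer.

```python
# Hypothetical sketch: trim alert text so prompt + expected reply stay
# under the context limit reported in the error. Names and constants
# here are assumptions for illustration, not SOCAI internals.

MAX_CONTEXT_TOKENS = 16385   # limit from the error for gpt-3.5-turbo-16k
RESPONSE_BUDGET = 1024       # tokens reserved for the model's reply
CHARS_PER_TOKEN = 4          # rough average for English text


def estimate_tokens(text: str) -> int:
    """Crude token estimate; precise counts need the model's tokenizer."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def truncate_to_budget(alert_text: str) -> str:
    """Cut the alert body so the prompt fits within the context window."""
    budget_tokens = MAX_CONTEXT_TOKENS - RESPONSE_BUDGET
    max_chars = budget_tokens * CHARS_PER_TOKEN
    if len(alert_text) <= max_chars:
        return alert_text
    return alert_text[:max_chars] + "\n[truncated to fit model context]"
```

With this kind of guard, an oversized alert (e.g. the 19,181-token payload in the log above) would be shortened instead of triggering a 400 from the API.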

@kazhuyo kazhuyo added the bug Something isn't working label Jul 18, 2024