
update/fix maximum token count, support 'chatgpt-4o-latest' #565

Merged: 1 commit, Oct 24, 2024

Conversation

zhujunsan
Contributor

and remove some deprecated models

@lanqian528

Actually, the maximum-token-count section could be removed entirely. Not specifying a maximum token count at all would be best; otherwise the list has to be updated every time a new model ships.

@zhujunsan
Contributor Author

But there really is a limit: requests that exceed a model's cap come back as errors. I only started maintaining this list after users reported those errors to me.

@lanqian528

lanqian528 commented Aug 15, 2024

But there really is a limit: requests that exceed a model's cap come back as errors. I only started maintaining this list after users reported those errors to me.

What I mean is that we can simply not pass the max_tokens parameter at all. max_tokens limits the number of output tokens; if it is omitted, the model's own maximum applies by default.

@zhujunsan
Contributor Author

Without max_tokens, how would we calculate how much context can fit into a request?

@lanqian528

Without max_tokens, how would we calculate how much context can fit into a request?

Context length is tied to the 128k input window. max_tokens only caps the output window, so context length has nothing to do with max_tokens. An error occurs only when the input exceeds 128k; the output can never exceed its own cap, so omitting max_tokens causes no problems at all. Therefore only options.maxModelTokens = 128000 (the context window) is needed; options.maxResponseTokens = 16384 (i.e. max_tokens=16384, the output limit) is unnecessary.
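The point above can be sketched as a small helper. This is only an illustration, not the project's actual code: the option names maxModelTokens and maxResponseTokens follow the discussion, max_tokens is the OpenAI API parameter, and the function names (buildBody, inputFits) are made up for this example.

```typescript
// Hypothetical sketch of the two separate limits discussed above.
// maxModelTokens: total context window on the input side (e.g. 128k).
// maxResponseTokens: optional cap on the output, sent as max_tokens.

interface RequestOptions {
  maxModelTokens: number;      // context window, e.g. 128000
  maxResponseTokens?: number;  // omit to use the model's own default
}

// Build a request body; include max_tokens only when explicitly set.
function buildBody(prompt: string, opts: RequestOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {
    messages: [{ role: 'user', content: prompt }],
  };
  if (opts.maxResponseTokens !== undefined) {
    body.max_tokens = opts.maxResponseTokens;
  }
  return body;
}

// Whether the input fits depends only on the context window,
// not on max_tokens -- the two limits are independent.
function inputFits(promptTokens: number, opts: RequestOptions): boolean {
  return promptTokens <= opts.maxModelTokens;
}
```

With this shape, only the context window ever needs to be known per model; leaving maxResponseTokens unset means no per-model output cap has to be maintained.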

BobDu merged commit a3f0ba8 into chatgpt-web-dev:main on Oct 24, 2024
2 checks passed
3 participants