Adjust GLM 4.6 Max tokens and add GLM 4.6V to available models from Coding Plan #4337
Chris-Terminator started this conversation in 1. Feature requests
Replies: 1 comment
Upvote for this kind of update, because we need it urgently. We tested the new GLM 4.6V model, released three days ago, as a coding model on many different projects, and it works wonderfully.
Adjusting GLM 4.6 Parameters
Currently, the International Coding plan via the ZAI provider displays and limits the max output tokens to 98,304, which is the maximum for GLM 4.5. In Z.ai's documentation, the output limit for GLM 4.6 is listed as 131,072 tokens.
Adding GLM 4.6V as an available model to the ZAI provider
Z.ai has just released GLM 4.6V as a new model in the lineup; it would be the successor to the currently configured GLM 4.5V in the provider code.
Would be great to see these changes made! :D
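The two requested changes could land in the provider's model table. A minimal sketch of what that might look like, assuming a model-definition map keyed by model ID; the interface, property names, and model IDs here are hypothetical and would need to match the actual provider code and Z.ai's published limits:

```typescript
// Hypothetical shape of a ZAI provider model entry; the real provider
// code likely uses different property names.
interface ZaiModelInfo {
  maxTokens: number;       // maximum output tokens
  supportsImages: boolean; // vision ("V") models accept image input
}

const zaiModels: Record<string, ZaiModelInfo> = {
  // Raise GLM 4.6's output cap from the GLM 4.5 value (98,304)
  // to the 131,072 limit Z.ai documents for GLM 4.6.
  "glm-4.6": { maxTokens: 131_072, supportsImages: false },

  // Add GLM 4.6V alongside (or as successor to) the existing
  // GLM 4.5V entry. Its exact output cap is an assumption here
  // and should be taken from Z.ai's model documentation.
  "glm-4.6v": { maxTokens: 131_072, supportsImages: true },
};
```

With a table like this, the plan's displayed limit and the model picker both follow from the same source of truth, so both requested changes are one-line edits.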

