Increasing the generated tokens does allow the response to be longer, but I have to set generated tokens quite high to get a full response, which reduces the context significantly...

IIRC, OpenRouter model context lengths can be set in the app. Is there any reason you can't set your Max Context to, say, 32k and your response length to 1k? Testing this on my device via OpenRouter with Gemini 2.5 Pro, it seems to work fine.
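To illustrate the trade-off being discussed: the response length is reserved out of the total context window, so the prompt (conversation history) gets whatever is left over. A minimal sketch of that budgeting, assuming a crude ~4-characters-per-token estimate in place of a real tokenizer (the helper names `estimate_tokens` and `trim_history` are hypothetical, not part of any app or OpenRouter API):

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (an assumption,
    not a real tokenizer)."""
    return max(1, len(text) // 4)

def trim_history(messages, max_context=32_000, max_response=1_000):
    """Drop the oldest messages until the prompt fits within
    max_context - max_response tokens, so the model always has room
    to generate a full-length reply."""
    budget = max_context - max_response
    kept = []
    used = 0
    # Walk newest-to-oldest so the most recent turns survive trimming.
    for msg in reversed(messages):
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))

history = [
    {"role": "user", "content": "old question " * 2000},
    {"role": "assistant", "content": "old answer " * 2000},
    {"role": "user", "content": "What is the capital of France?"},
]
# With an 8k window and 1k reserved for the response, the oldest
# message no longer fits and gets dropped.
trimmed = trim_history(history, max_context=8_000, max_response=1_000)
```

With a 32k context and a 1k response length, as suggested above, roughly 31k tokens remain for the prompt; the same function shows why cranking the response length up instead eats directly into that prompt budget.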

Answer selected by kaibagley