
Conversation

@zerone0x
Contributor

Summary

Fixes #8291

The openai/gpt-5.1-codex-max model is available on OpenRouter's API but was not included in the models.dev database, making it unavailable to OpenCode users connecting via OpenRouter.

Changes

  • Added dynamic model injection for openai/gpt-5.1-codex-max to the OpenRouter provider
  • The model is only added if the base openai/gpt-5.1-codex model exists, so there is a valid API configuration to inherit from
  • Follows the same pattern used for GitHub Copilot model handling (a rough sketch follows the model specifications below)

Model specifications:

  • Context limit: 400,000 tokens
  • Output limit: 128,000 tokens
  • Supports: reasoning, tool calls, image input
  • Pricing: $1.1/M input, $9/M output, $0.11/M cache read
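
For reference, a rough sketch of the injection, assuming a models map keyed by model ID. The Model shape and field names (limit, cost, cache_read, attachment) are illustrative assumptions and may not match OpenCode's actual provider schema.

```ts
// Illustrative sketch only: the Model shape and field names here are
// assumptions, not taken from the OpenCode codebase.
type Model = {
  id: string
  name: string
  reasoning: boolean
  tool_call: boolean
  attachment: boolean
  limit: { context: number; output: number }
  cost: { input: number; output: number; cache_read: number }
}

function injectCodexMax(models: Record<string, Model>) {
  const base = models["openai/gpt-5.1-codex"]
  // Only inject the -max variant when the base model exists, so there is
  // a valid API configuration to inherit from.
  if (!base) return
  models["openai/gpt-5.1-codex-max"] = {
    ...base,
    id: "openai/gpt-5.1-codex-max",
    name: "GPT-5.1 Codex Max",
    reasoning: true,
    tool_call: true,
    attachment: true, // image input
    limit: { context: 400_000, output: 128_000 },
    cost: { input: 1.1, output: 9, cache_read: 0.11 },
  }
}
```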

Test plan

  • Verify model appears in /models when using OpenRouter provider
  • Verify model is selectable and usable for chat
  • Verify existing OpenRouter models continue to work

🤖 Generated with Claude Code

The openai/gpt-5.1-codex-max model is available in OpenRouter's API
but was not included in the models.dev database. This adds the model
dynamically to the OpenRouter provider, similar to how github-copilot
models are handled.

Fixes anomalyco#8291

Co-Authored-By: Claude <[email protected]>
@github-actions
Contributor

The following comment was made by an LLM; it may be inaccurate:

No duplicate PRs found

Collaborator


this should be added to models.dev
