feat: add MiniMax as a supported LLM provider with M2.7 models#5666

Open
octo-patch wants to merge 2 commits into comet-ml:main from octo-patch:feature/add-minimax-integration

Conversation

octo-patch commented on Mar 15, 2026

Summary

Add MiniMax as a supported LLM provider for Opik, with the latest M2.7 flagship model as default.

Changes

  • Add MiniMax temperature clamping filter in LiteLLM util (temperature > 0 requirement)
  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed as new models (set M2.7 as default)
  • Add MiniMax-M2.5 and M2.5-highspeed as alternative models
  • Add model entries to frontend (providers.ts, useLLMProviderModelsData.ts) and backend (OpenRouterModelName.java)
  • Add model pricing data for M2.7 models in model_prices_and_context_window.json
  • Add comprehensive MiniMax integration documentation (minimax.mdx) with OpenAI SDK and LiteLLM examples
  • Add MiniMax to integrations overview and navigation
  • Update README with MiniMax integration link
  • Add unit tests for temperature clamping behavior

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, offering a 1M token context window.

Testing

  • All 14 unit tests pass, including the MiniMax temperature clamping tests
  • Integration tested with MiniMax API
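
The clamping contract these tests cover can be sketched as a small unit-test style check. `clamp_temperature` below is an illustrative stand-in for the real filter in util.py, and the 0.01 floor is the value this PR describes, not a verified library constant:

```python
# Illustrative sketch only: `clamp_temperature` stands in for the real
# MiniMax filter in sdks/python/src/opik/evaluation/models/litellm/util.py.
# The 0.01 floor is assumed from the PR description.

def clamp_temperature(value: float, floor: float = 0.01) -> float:
    """MiniMax rejects temperature <= 0, so clamp to a small positive value."""
    return floor if value <= 0 else value

# Unit-test style checks of the clamping contract
assert clamp_temperature(0) == 0.01      # zero is raised to the floor
assert clamp_temperature(-1.0) == 0.01   # negatives are clamped too
assert clamp_temperature(0.7) == 0.7     # valid values pass through
```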

Add MiniMax integration documentation, temperature constraint handling,
and tests to support MiniMax models (MiniMax-M2.5, MiniMax-M2.5-highspeed)
as a first-class provider in Opik.

Changes:
- Add MiniMax integration documentation page (minimax.mdx) with usage
  examples for OpenAI SDK, LiteLLM, and evaluation metrics
- Add MiniMax to the Model Providers section in overview and README
- Add MiniMax navigation entry and redirect in docs.yml
- Add MiniMax-specific temperature filter in util.py (MiniMax requires
  temperature > 0, so zero values are clamped to 0.01)
- Add unit tests for MiniMax temperature constraint handling
octo-patch requested review from a team as code owners on March 15, 2026 12:07
github-actions bot added the documentation, python, tests, and Python SDK labels on Mar 15, 2026
Comment on lines +208 to +212
## Important Notes

- **Temperature**: MiniMax models require temperature to be strictly greater than 0. If you set `temperature=0`, it will be automatically adjusted when using LiteLLM with Opik.
- **API Compatibility**: MiniMax's API is fully compatible with the OpenAI SDK, so any OpenAI-compatible tool or framework will work with MiniMax.

The docs claim MiniMax temperature=0 is auto-adjusted when using LiteLLM with Opik, but the clamp is only applied in LiteLLMChatModel (sdks/python/src/opik/evaluation/models/litellm/util.py); the Opik monitoring path (sdks/python/src/opik/evaluation/models/litellm/opik_monitor.py lines 20–94) never calls _apply_minimax_filters. Calling litellm.completion with temperature=0 and only Opik callbacks therefore still sends temperature=0 to MiniMax and fails. Can we either narrow the doc note to the evaluation-model use case or add the clamp to the LiteLLM monitoring integration?

Finding type: Logical Bugs | Severity: 🔴 High



Prompt for AI Agents:

In apps/opik-documentation/documentation/fern/docs/tracing/integrations/minimax.mdx
around lines 208-212, the note incorrectly states that MiniMax temperature=0 is
automatically adjusted when using LiteLLM with Opik. Change the wording to accurately
reflect the current behavior: state that the automatic clamp only occurs when using the
LiteLLMChatModel evaluation integration, not when using the generic OpikLogger
callbacks; alternatively, implement the clamp in the LiteLLM monitoring integration
(sdks/python/src/opik/evaluation/models/litellm/opik_monitor.py) so OpikLogger also
applies _apply_minimax_filters before sending requests. Make the doc update concise and
unambiguous, or if you implement the code change, add a brief comment in opik_monitor.py
indicating why the clamp is needed for MiniMax compatibility.
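
A minimal sketch of the second option, assuming a hypothetical pre-call hook in the monitoring path. `clamp_minimax_params` and the `minimax/` prefix check are illustrative names, not the actual opik_monitor.py API:

```python
# Hypothetical sketch: apply the MiniMax temperature clamp in the monitoring
# path as well, so litellm.completion calls with only Opik callbacks get the
# same protection as LiteLLMChatModel. Names here are illustrative.

from typing import Any, Dict

MIN_MINIMAX_TEMPERATURE = 0.01  # assumed clamp floor, matching util.py

def clamp_minimax_params(model: str, params: Dict[str, Any]) -> Dict[str, Any]:
    """Return params with temperature clamped for MiniMax models."""
    if not model.startswith("minimax/"):
        return params  # other providers are left untouched
    temperature = params.get("temperature")
    if isinstance(temperature, (int, float)) and temperature <= 0:
        # copy rather than mutate the caller's dict
        params = {**params, "temperature": MIN_MINIMAX_TEMPERATURE}
    return params
```

A pre-call hook would run this on the request kwargs before they reach the provider, mirroring what LiteLLMChatModel already does on the evaluation path.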

Comment on lines +109 to +120
def _apply_minimax_filters(
    params: Dict[str, Any],
    already_warned: Set[str],
    warn: Callable[[str, Any], None],
) -> None:
    """Apply MiniMax specific parameter filters.

    MiniMax requires temperature to be strictly greater than 0.
    A temperature of 0 is rejected by the API, so we clamp it to a small
    positive value to avoid errors.
    """

params['temperature'] is treated as optional, but _apply_minimax_filters only clamps values that parse to float and are <= 0.0. float(value) raises for None and non-numeric inputs, so params['temperature'] remains None in those cases. This lets MiniMax requests go out with a null temperature, which the API rejects. Can we normalize invalid/null temperatures before returning (e.g., set them to 0.01, or drop and log them) so callers can rely on a positive number?

Finding type: Validate nullable inputs | Severity: 🔴 High



Prompt for AI Agents:

In sdks/python/src/opik/evaluation/models/litellm/util.py around lines 109-128, the
_apply_minimax_filters function currently only clamps numeric temperatures <= 0 but
leaves None or non-numeric values unchanged. Change the logic so that if params contains
"temperature" but parsing to float fails or the parsed value is None, set
params["temperature"] = 0.01 (the same clamp used for non-positive numbers) and call
warn(...) to indicate an invalid/null temperature was replaced; also keep the existing
behavior of clamping numeric <= 0 to 0.01. This ensures downstream callers never receive
a null/invalid temperature.
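
The suggested normalization could look roughly like this: a standalone sketch, with the warn callback and already_warned bookkeeping omitted for brevity and the 0.01 floor assumed from the PR description:

```python
# Standalone sketch of the suggested fix: None and non-numeric temperature
# values are replaced instead of being forwarded to the MiniMax API. The
# 0.01 floor is assumed from the PR description, not a verified constant.

from typing import Any, Dict

_MIN_TEMPERATURE = 0.01

def clamp_minimax_temperature(params: Dict[str, Any]) -> None:
    """Ensure 'temperature', if present, ends up a positive float."""
    if "temperature" not in params:
        return
    try:
        value = float(params["temperature"])
    except (TypeError, ValueError):
        # None or non-numeric: replace rather than send an invalid value
        params["temperature"] = _MIN_TEMPERATURE
        return
    if value <= 0.0:
        params["temperature"] = _MIN_TEMPERATURE
    else:
        params["temperature"] = value  # also normalizes numeric strings
```

In the real filter, each replacement would additionally go through warn(...) so users learn their temperature was adjusted.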

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model lists (frontend,
  backend, OpenRouter enum, pricing JSON)
- Update default model references from M2.5 to M2.7 in docs and tests
- Add M2.7 pricing data (cache_read updated to $0.06/M tokens)
- Keep all previous models as available alternatives
github-actions bot added the java, Frontend, Backend, and typescript labels on Mar 18, 2026
octo-patch changed the title from "feat: add MiniMax as a supported LLM provider" to "feat: add MiniMax as a supported LLM provider with M2.7 models" on Mar 18, 2026
from opik.evaluation.models import LiteLLMChatModel

# Create a MiniMax model for evaluation
minimax_model = LiteLLMChatModel(model_name="minimax/MiniMax-M2.7")
Should we replace the hand-pasted minimax_model example with a link to the generated example, or expand it to include the checklist-required intent line, a minimal runnable context with an inline behavior comment, and a maintenance note linking to the canonical/generated example and its owner?

Finding type: Keep docs accurate | Severity: 🟢 Low



Prompt for AI Agents:

In apps/opik-documentation/documentation/fern/docs/tracing/integrations/minimax.mdx
around line 190, the single-line example minimax_model =
LiteLLMChatModel(model_name="minimax/MiniMax-M2.7") is hand-pasted and missing required
metadata. First search the autogenerated SDK/docs and examples for a canonical/generated
Minimax example and, if found, replace this line with a link to that canonical example
(include the file path/URL and a brief note). If no canonical example exists, expand
this snippet by adding: (1) a one-line intent/trigger sentence immediately above it
describing when to use the snippet, (2) a minimal runnable context (import and variable
assignment) with an inline comment on the same line explaining the observable behavior,
and (3) a maintenance note below pointing to the canonical/generated artifact (or state
“none”), include the owning team, and an update cadence. Ensure the maintenance note
references a repo path or URL rather than vague text.
