feat: Make AI engine parameters (temperature, num_ctx) configurable #41

@dguembel-itomig

Description

Problem

Currently, AI engine parameters like temperature and num_ctx are hardcoded in OllamaAIEngine.php:

$oConfig->modelOptions = array(
    'num_ctx' => '16384',
    'temperature' => '0.4',
);

This is not ideal because:

  • temperature is a parameter supported by all AI providers — it should be configurable across engines
  • num_ctx (context window size) is Ollama-specific and may not apply to other providers
  • Users cannot adjust these values without modifying the source code

Proposed Solution

Make these parameters configurable via iTop's configuration system, e.g. through ai_engine.configuration.model_options:

'ai_engine.configuration' => array(
    'url' => 'http://127.0.0.1:11434/api/',
    'api_key' => '',
    'model' => 'qwen2.5:14b',
    'model_options' => array(
        'temperature' => '0.4',
        'num_ctx' => '16384',
    ),
),

Each engine implementation could then read and apply relevant options from this configuration.
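As a rough illustration of that idea, the snippet below sketches how an engine could merge user-supplied `model_options` over its built-in defaults. The class and method names (`OllamaAIEngineSketch`, `ResolveModelOptions`) are hypothetical and not part of the actual iTop or module API; only the config shape follows the proposal above.

```php
<?php
// Hypothetical sketch, not the real iTop API: merge configured model
// options over built-in defaults so unset keys keep sensible values.

class OllamaAIEngineSketch
{
    /** @var array Built-in fallbacks used when the config omits a key */
    private const DEFAULT_MODEL_OPTIONS = [
        'num_ctx'     => 16384,
        'temperature' => 0.4,
    ];

    /**
     * @param array $aEngineParams Decoded 'ai_engine.configuration' array
     * @return array Effective model options (configured values win)
     */
    public static function ResolveModelOptions(array $aEngineParams): array
    {
        $aConfigured = $aEngineParams['model_options'] ?? [];
        return array_merge(self::DEFAULT_MODEL_OPTIONS, $aConfigured);
    }
}

// Example: a config that only overrides temperature keeps the default num_ctx
$aOptions = OllamaAIEngineSketch::ResolveModelOptions([
    'url'           => 'http://127.0.0.1:11434/api/',
    'model_options' => ['temperature' => 0.2],
]);
// $aOptions now holds num_ctx => 16384 and temperature => 0.2
```

Engines with provider-specific keys (like `num_ctx`) would simply ignore options they do not understand, which keeps the configuration format shared across providers.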

Scope

  • Concrete: OllamaAIEngine — until v26.1.1 the only engine with hardcoded model options (the block has since been removed, see Notes)
  • General: All engines could benefit from a standardized way to pass provider-specific parameters

Notes

In v26.1.1, the hardcoded model_options block has been removed from OllamaAIEngine as a cleanup step. This issue tracks the follow-up work to make these parameters properly configurable.

Labels: enhancement (New feature or request)