LLM provider credentials are currently owned by the chat plugin — hardcoded to one key per provider, with no user-facing label, no sharing across plugins, and no platform-level place to manage them. This issue introduces a first-class `LLMConfig` resource: a named, per-user bundle of provider + model + API key that any plugin can reference by ID.
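To make the shape of the proposed resource concrete, here is a minimal sketch of what an `LLMConfig` row could look like. This is illustrative only: the field names (`label`, `provider`, `model`, `api_key_encrypted`, `deleted_at`) are assumptions, not the project's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional
import uuid

@dataclass
class LLMConfig:
    """Hypothetical per-user LLM credential bundle (not the real schema)."""
    user_id: str
    label: str                 # user-facing name, e.g. "Work OpenAI key"
    provider: str              # e.g. "openai", "anthropic"
    model: str                 # e.g. "gpt-4o"
    api_key_encrypted: bytes   # ciphertext; plaintext keys never stored
    id: str = field(default_factory=lambda: str(uuid.uuid4()))
    deleted_at: Optional[datetime] = None  # soft-delete marker

    @property
    def is_deleted(self) -> bool:
        return self.deleted_at is not None
```

The `deleted_at` timestamp (rather than a hard `DELETE`) lets plugins that still reference a config by ID fail gracefully instead of dangling.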
## Problem
- Credentials live inside `app/core_plugins/chat/` — any new plugin needing LLM access must duplicate or couple to chat
- Users can only have one active key per provider; no way to label or distinguish keys
- There is no management UI — API keys are buried inside individual plugin config forms
## Solution
- Extract shared LLM infrastructure (`providers.py`, `encryption.py`, `cache.py`) out of the chat plugin into a new `app/llm/` module
- Introduce `LLMConfig` — a named, soft-deletable credential row owned by the user
- Expose full CRUD at `/api/v1/llm/configs`
- Add a dedicated LLM Configs settings page where users manage all their credentials in one place
- Update plugin config forms (chat, Slack) to select a config from a dropdown instead of entering raw API keys
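The last bullet is the key decoupling step: a plugin's config form stores only a config ID, and the shared module resolves it at call time. A minimal in-memory sketch of that lookup, with store and function names that are illustrative rather than the real module API:

```python
from typing import Dict, Optional

# Hypothetical in-memory stand-in for the LLMConfig table.
_configs: Dict[str, dict] = {}

def save_config(config_id: str, provider: str, model: str, key_ref: str) -> None:
    """Register a config, as the settings page would on create."""
    _configs[config_id] = {"provider": provider, "model": model, "key_ref": key_ref}

def resolve_config(config_id: str) -> Optional[dict]:
    """What a plugin (chat, Slack) calls with the ID from its dropdown.

    Returns None for unknown IDs so callers can surface a clear
    'config missing or deleted' error instead of a raw key failure.
    """
    return _configs.get(config_id)
```

Because plugins hold only an opaque ID, rotating or relabeling a key in the settings page takes effect everywhere without touching any plugin's stored config.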