local model support? #18

@innokean

Any plans for local model support? Any caveats if I put Ollama or vLLM here?
https://github.com/principia-ai/WriteHERE/blob/main/recursive/llm/llm.py#L148-L163
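For context, here is a minimal sketch of what I had in mind. It assumes the client at the linked lines speaks the OpenAI-compatible chat-completions API, in which case both Ollama and vLLM could be dropped in by swapping the base URL; the helper names (`local_endpoint`, `chat_payload`) are mine, not part of WriteHERE:

```python
# Hypothetical sketch: Ollama and vLLM both expose an OpenAI-compatible
# /v1 endpoint, so in principle only the base URL needs to change while
# the request payload shape stays the same.

def local_endpoint(backend: str) -> str:
    """Return the default OpenAI-compatible base URL for a local backend."""
    defaults = {
        "ollama": "http://localhost:11434/v1",  # Ollama's default port
        "vllm": "http://localhost:8000/v1",     # vLLM's default serve port
    }
    return defaults[backend]

def chat_payload(model: str, prompt: str) -> dict:
    """Build a chat-completions request body both backends accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

print(local_endpoint("ollama"))
print(chat_payload("llama3", "hello")["messages"][0]["role"])
```

Whether this actually works depends on how tightly the code at L148-L163 is coupled to the hosted providers (auth headers, model-name routing, etc.), hence the question about caveats.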

Labels: enhancement (New feature or request)