
morganross/mcp_chatter


chatterr-mcp

chatterr-mcp is a local MCP stdio server that provides one tool, chatterr, to run deterministic back-and-forth debates between two local LLMs.

It is packaged to behave like other installable MCP servers (the Playwright/Puppeteer operational flow): install the package, register the command, use it in chat.

Quick answer: can VS Code Copilot Chat just point at this repo and run it?

Not by repo path alone. Like other MCP servers, Copilot needs a configured server command.

This repo now provides everything needed:

  • installable Python package (pyproject.toml)
  • console entrypoint (chatterr-server)
  • module entrypoint (python -m chatterr_mcp)
  • MCP sample config (mcp.sample.json)
  • tests

Install

For a dedicated step-by-step install guide, see INSTALL.md.

Option A (recommended): editable local install

python3 -m pip install -e .

Then the server command is available:

chatterr-server

Option B: run from source

PYTHONPATH=src python3 -m chatterr_mcp

VS Code MCP configuration

Use the same pattern as other MCP servers: configure a command.

Example (installed command):

{
  "servers": {
    "chatterr": {
      "command": "chatterr-server",
      "args": [],
      "env": {
        "CHATTER_MODEL_CMD_TEMPLATE": "ollama run {model}"
      }
    }
  }
}

A source-mode configuration template is provided in mcp.sample.json.
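If you run from source rather than installing, the configuration can point at the module entrypoint instead of the console command. A hedged sketch, built from the run-from-source command above; check mcp.sample.json for the exact shape:

```json
{
  "servers": {
    "chatterr": {
      "command": "python3",
      "args": ["-m", "chatterr_mcp"],
      "env": {
        "PYTHONPATH": "src",
        "CHATTER_MODEL_CMD_TEMPLATE": "ollama run {model}"
      }
    }
  }
}
```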

Tool: chatterr

Input schema fields:

  • topic (string)
  • model_a (string)
  • model_b (string)
  • max_turns (integer >= 1)
  • stop_phrase (optional string)
  • timeout_seconds (optional integer, default 60)

Output fields:

  • transcript (string)
  • turns_completed (integer)
  • status (completed | stopped | error)
  • error (optional string)
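To make the schema concrete, here is a hypothetical request/response pair for the tool, assuming the standard MCP tools/call method; the field names mirror the input and output lists above, and the argument values are illustrative only:

```python
import json

# Hypothetical tools/call request body (JSON-RPC 2.0, before Content-Length
# framing). The "arguments" keys mirror the input schema fields above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chatterr",
        "arguments": {
            "topic": "Are microservices worth the operational cost?",
            "model_a": "llama3",
            "model_b": "mistral",
            "max_turns": 4,
            "stop_phrase": "I concede",
            "timeout_seconds": 60,
        },
    },
}

# A plausible result payload, matching the output fields listed above.
result = {
    "transcript": "...",
    "turns_completed": 4,
    "status": "completed",
    "error": None,
}

# Both are plain JSON-serializable dicts.
encoded = json.dumps(request)
```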

Model command backend

Default per-call command:

ollama run <model>

Override the backend command template via an environment variable:

CHATTER_MODEL_CMD_TEMPLATE='your_command {model}'

Each model call receives the full prompt/transcript on stdin and must write its reply as plain text to stdout.
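The stdin/stdout contract can be sketched as a subprocess call. This is an illustration of the contract, not the server's actual internals; the function name call_model is hypothetical, and the example swaps in `cat` as a stand-in backend so it runs without a local LLM:

```python
import os
import shlex
import subprocess

def call_model(model: str, prompt: str, timeout_seconds: int = 60) -> str:
    """One backend call: prompt on stdin, plain-text reply on stdout."""
    template = os.environ.get("CHATTER_MODEL_CMD_TEMPLATE", "ollama run {model}")
    cmd = shlex.split(template.format(model=model))
    proc = subprocess.run(
        cmd,
        input=prompt,            # full prompt/transcript on stdin
        capture_output=True,
        text=True,
        timeout=timeout_seconds,
    )
    return proc.stdout           # plain text reply

# Stand-in backend: `cat` echoes the prompt back unchanged.
os.environ["CHATTER_MODEL_CMD_TEMPLATE"] = "cat"
reply = call_model("any-model", "Hello, debate partner")
```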

Protocol compatibility

This server uses MCP JSON-RPC over stdio with Content-Length framing for compatibility with MCP clients.
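Content-Length framing prefixes each JSON-RPC body with a header giving its byte length, separated from the body by a blank line. A small encoder/decoder illustrates the wire format (a sketch, not the server's code):

```python
import io
import json

def frame(message: dict) -> bytes:
    """Encode a JSON-RPC message with a Content-Length header."""
    body = json.dumps(message).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

def read_framed(stream) -> dict:
    """Read one framed message: headers until a blank line, then the body."""
    length = 0
    while True:
        line = stream.readline()
        if line in (b"\r\n", b"\n", b""):
            break  # blank line ends the header block
        name, _, value = line.partition(b":")
        if name.strip().lower() == b"content-length":
            length = int(value.strip())
    return json.loads(stream.read(length))

# Round-trip a message through the framing.
msg = {"jsonrpc": "2.0", "id": 1, "method": "initialize"}
decoded = read_framed(io.BytesIO(frame(msg)))
```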

Dev checks

python3 -m py_compile src/chatterr_mcp/server.py src/chatterr_mcp/__main__.py chatterr_server.py
PYTHONPATH=src python3 -m unittest discover -s tests -v

