PRD: Rust Rewrite of Ralph TUI — AllFrame + Rig + AllSource Core
Problem Statement
Ralph TUI is a TypeScript/Bun AI agent loop orchestrator that manages autonomous coding agents through task backlogs. Its current architecture has limitations:
- Task storage is file-based (JSON, Beads/JSONL, SQLite) — no durability guarantees, no queryable history, no multi-agent collaboration
- Agent integration shells out to CLI tools (Claude Code, OpenCode) and scrapes stdout for completion markers — fragile, no structured tool use
- No native AI capabilities — the orchestrator itself can't reason about tasks, generate prompts intelligently, or evaluate outputs semantically
- TypeScript runtime overhead — Bun is fast for JS but can't match native Rust for a TUI that manages concurrent subprocesses
Proposed Solution
Rewrite ralph-tui in Rust using three AllSource ecosystem crates plus rig:
| Crate | Role |
|---|---|
| `allframe` | Web framework with CQRS + Event Sourcing, compile-time DI, CommandBus, Projections, Sagas |
| `allframe-mcp` | Expose task management as MCP tools — any LLM can manage the backlog |
| `allsource-core` | Embedded event store backend for AllFrame's CQRS (WAL + Parquet durability) |
| `rig-core` | LLM agent framework — multi-provider tool calling, streaming, structured extraction |
The key insight: AllFrame's CQRS module already has an AllSource backend (`AllSourceBackend<E>` behind the `cqrs-allsource` feature). This means we don't wire AllSource manually — we define Aggregates, Commands, and Events using AllFrame's traits, and AllSource provides the durable event store underneath.
Architecture
```
┌──────────────────────────────────────────────────────────────┐
│ ralph (Rust binary)                                          │
│                                                              │
│ ┌────────────┐  ┌─────────────────────┐  ┌──────────────┐    │
│ │ TUI Layer  │  │   AllFrame CQRS     │  │ Agent Engine │    │
│ │ (ratatui)  │  │                     │  │    (rig)     │    │
│ │            │  │ CommandBus          │  │              │    │
│ │ Task list  │  │ Aggregates          │  │ Planner      │    │
│ │ Agent view │  │ Projections         │  │ Coder        │    │
│ │ Analytics  │  │ Sagas               │  │ Evaluator    │    │
│ └─────┬──────┘  │ ProjectionRegistry  │  └──────┬───────┘    │
│       │         └──────────┬──────────┘         │            │
│       │                    │                    │            │
│       │         ┌──────────▼──────────┐         │            │
│       │         │  AllSource Backend  │         │            │
│       │         │   (WAL + Parquet)   │◄────────┘            │
│       │         └──────────┬──────────┘  tool calls          │
│       │                    │             emit commands       │
│ ┌─────▼────────────────────▼─────────────────────────┐       │
│ │              AllFrame MCP Server                   │       │
│ │ Exposes task ops as LLM-callable tools over stdio  │       │
│ └────────────────────────────────────────────────────┘       │
└──────────────────────────────────────────────────────────────┘
```
Four layers
- TUI Layer (`ratatui`) — Terminal UI with task list, agent output streaming, progress panels
- CQRS Layer (`allframe`) — CommandBus dispatches commands → Aggregates validate → Events emitted → Projections materialize read models → Sagas coordinate multi-step workflows
- Storage Layer (`allsource-core` via `AllSourceBackend`) — Events durably stored in WAL + Parquet. DashMap for μs reads. EventQL for analytics queries.
- Agent Layer (`rig-core`) — LLM agents for planning, coding, evaluation. Tool calls dispatch commands through the CommandBus.
Why AllFrame matters
Without AllFrame, you'd manually wire: command validation → event emission → AllSource ingestion → projection updates → subscription notifications. AllFrame's CQRS module handles all of this declaratively:
```rust
use std::collections::HashMap;
use serde::{Deserialize, Serialize};

// Define the domain
#[derive(Clone, Serialize, Deserialize)]
enum TaskEvent {
    Created { id: String, title: String, priority: u8, depends_on: Vec<String> },
    Assigned { agent_id: String, session_id: String },
    Completed { passes: bool, summary: String, duration_ms: u64 },
    Failed { error: String, retry_count: u8 },
}

impl allframe::cqrs::Event for TaskEvent {}

#[derive(Clone, Copy, Default, PartialEq)]
enum TaskStatus {
    #[default]
    Open,
    InProgress,
    Completed,
    Failed,
}

#[derive(Default)]
struct TaskAggregate {
    id: String,
    status: TaskStatus,
    assigned_to: Option<String>,
}

impl allframe::cqrs::Aggregate for TaskAggregate {
    type Event = TaskEvent;

    fn apply_event(&mut self, event: &TaskEvent) {
        match event {
            TaskEvent::Created { id, .. } => { self.id = id.clone(); self.status = TaskStatus::Open; }
            TaskEvent::Assigned { agent_id, .. } => { self.assigned_to = Some(agent_id.clone()); self.status = TaskStatus::InProgress; }
            TaskEvent::Completed { .. } => { self.status = TaskStatus::Completed; }
            TaskEvent::Failed { .. } => { self.status = TaskStatus::Failed; }
        }
    }
}

// Projection: backlog read model
#[derive(Default)]
struct BacklogProjection {
    tasks: HashMap<String, TaskView>,
}

impl allframe::cqrs::Projection for BacklogProjection {
    type Event = TaskEvent;

    fn apply(&mut self, event: &TaskEvent) {
        // Update the materialized view
    }
}

// Wire it up with the AllSource backend
let backend = AllSourceBackend::<TaskEvent>::production(".ralph/data")?;
let event_store = EventStore::with_backend(backend);
let command_bus = CommandBus::new();

// Register projections for automatic updates
let mut registry = ProjectionRegistry::new(event_store.clone());
registry.register(BacklogProjection::default());
```
AllFrame MCP: LLM-accessible task management
AllFrame-MCP auto-exposes handlers as MCP tools. This means any LLM (Claude Code, Cursor, etc.) can query and manage the ralph backlog:
```rust
use allframe_mcp::McpServer;

let mut router = Router::new();
router.register("list_tasks", list_tasks_handler);
router.register("create_task", create_task_handler);
router.register("complete_task", complete_task_handler);
router.register("task_history", task_history_handler);

let mcp = McpServer::with_router(router);
mcp.serve_stdio().await?; // Any MCP client can now manage tasks
```
This is a differentiator no other ralph-tui tracker has: the task backlog is natively accessible to LLMs as MCP tools, not just to the orchestrator's hardcoded agent integration.
Rig agents dispatch AllFrame commands
Instead of rig tools directly calling AllSource, they dispatch commands through AllFrame's CommandBus — getting validation, event emission, projection updates, and saga coordination for free:
```rust
// Note: the tool struct holds an Arc<CommandBus>, so it cannot derive
// Serialize/Deserialize itself — only its Args type needs Deserialize.
struct CompleteTaskTool {
    command_bus: Arc<CommandBus>,
}

#[derive(Deserialize)]
struct CompleteTaskArgs {
    task_id: String,
    summary: String,
    files_changed: Vec<String>,
}

impl rig::tool::Tool for CompleteTaskTool {
    const NAME: &'static str = "complete_task";
    type Args = CompleteTaskArgs;
    type Output = String;
    type Error = anyhow::Error;

    async fn definition(&self, _prompt: String) -> ToolDefinition {
        ToolDefinition {
            name: "complete_task".into(),
            description: "Mark a task as completed with a summary".into(),
            parameters: json!({
                "type": "object",
                "properties": {
                    "task_id": { "type": "string" },
                    "summary": { "type": "string" },
                    "files_changed": { "type": "array", "items": { "type": "string" } }
                },
                "required": ["task_id", "summary"]
            }),
        }
    }

    async fn call(&self, args: Self::Args) -> Result<Self::Output, Self::Error> {
        // Dispatch through CommandBus → validates → emits TaskEvent::Completed
        // → AllSource persists → projections update → sagas trigger
        self.command_bus.dispatch(CompleteTask {
            task_id: args.task_id.clone(),
            summary: args.summary,
            files_changed: args.files_changed,
        }).await.map_err(|e| anyhow::anyhow!(e))?;
        Ok(format!("Task {} completed", args.task_id))
    }
}
```
Saga: Task Completion Cascade
When a task completes, a saga checks if all siblings in the epic are done, and if so, completes the epic:
```rust
impl allframe::cqrs::Saga for EpicCompletionSaga {
    type Event = TaskEvent;

    fn handle(&self, event: &TaskEvent) -> Vec<Box<dyn Command>> {
        if let TaskEvent::Completed { .. } = event {
            // Check if all tasks in the epic are complete
            // If so, emit a CompleteEpic command
        }
        vec![]
    }
}
```
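The sibling check itself is pure logic, independent of AllFrame's actual `Saga` trait. A std-only sketch of the decision the saga makes against the epic-progress read model (all names here — `Status`, `epic_is_complete` — are illustrative, not AllFrame API):

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq)]
enum Status { Open, InProgress, Completed, Failed }

// tasks maps task id -> (epic id, status). The epic is complete only if it
// has at least one task and every task in it is Completed.
fn epic_is_complete(tasks: &HashMap<String, (String, Status)>, epic_id: &str) -> bool {
    let mut seen_any = false;
    for (_, (epic, status)) in tasks {
        if epic == epic_id {
            seen_any = true;
            if *status != Status::Completed {
                return false;
            }
        }
    }
    seen_any
}

fn main() {
    let mut tasks = HashMap::new();
    tasks.insert("US-001".to_string(), ("E-1".to_string(), Status::Completed));
    tasks.insert("US-002".to_string(), ("E-1".to_string(), Status::InProgress));
    assert!(!epic_is_complete(&tasks, "E-1")); // one sibling still in progress
    tasks.insert("US-002".to_string(), ("E-1".to_string(), Status::Completed));
    assert!(epic_is_complete(&tasks, "E-1")); // now the saga would dispatch CompleteEpic
}
```

In the real saga, the `true` branch would return a `CompleteEpic` command instead of a boolean.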
Event Schema
All events use `entity_id = task-{id}` and `tenant_id = {project-name}`.

| Event Type | Payload | Trigger |
|---|---|---|
| `TaskCreated` | `{id, title, description, acceptance_criteria, priority, depends_on, labels, epic_id}` | Planner agent |
| `TaskAssigned` | `{agent_id, session_id}` | SELECT phase |
| `TaskStarted` | `{agent_type, model}` | EXECUTE phase |
| `TaskCompleted` | `{passes: true, summary, duration_ms, files_changed}` | Evaluator confirms |
| `TaskFailed` | `{error, retry_count, will_retry}` | Evaluator detects failure |
| `TaskBlocked` | `{blocked_by: [task_ids]}` | Dependency not met |
| `EpicCreated` | `{id, title, prd_ref}` | PRD conversion |
| `EpicCompleted` | `{child_count, duration_ms}` | Saga: all children done |
| `SessionStarted` | `{session_id, config, agent_model}` | `ralph run` |
| `SessionEnded` | `{tasks_completed, tasks_failed, duration, tokens_used}` | `ralph run` ends |
| `AgentToolCall` | `{task_id, tool_name, args_summary, result_summary}` | Tool invocation |
CLI Commands
| Command | Description |
|---|---|
| `ralph init` | Initialize project with AllSource data dir |
| `ralph plan <prd.md>` | Planner agent decomposes PRD into tasks via CommandBus |
| `ralph run [--parallel N]` | Autonomous execution loop |
| `ralph run --task US-003` | Execute a specific task |
| `ralph resume` | Resume from last session |
| `ralph status` | Show backlog projection |
| `ralph history <task-id>` | Full event history (EventQL) |
| `ralph analytics` | Cycle time, throughput, cost |
| `ralph mcp` | Start MCP server — expose backlog to any LLM |
| `ralph export` | Export to JSON/Beads format |
Configuration
```toml
# ralph.toml
[project]
name = "my-project"
data_dir = ".ralph"

[agent]
provider = "anthropic"

[agent.planner]
model = "claude-sonnet-4-5-20250514"

[agent.coder]
model = "claude-sonnet-4-5-20250514"
max_tokens = 16384

[agent.evaluator]
model = "claude-haiku-4-5-20251001"

[execution]
max_retries = 2
parallel = 1
sandbox = true

[mcp]
enabled = false # Enable MCP server alongside TUI

[tui]
theme = "dark"
```
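The `[execution]` and `[mcp]` tables map naturally onto config structs whose `Default` impls match the sample file, so an absent table falls back to the values shown above. A std-only sketch (the real code would add serde derives and parse with the `toml` crate; the struct names are assumptions):

```rust
// Mirrors [execution] in ralph.toml; defaults match the sample file.
#[derive(Debug, PartialEq)]
struct ExecutionConfig {
    max_retries: u8,
    parallel: usize,
    sandbox: bool,
}

impl Default for ExecutionConfig {
    fn default() -> Self {
        ExecutionConfig { max_retries: 2, parallel: 1, sandbox: true }
    }
}

// Mirrors [mcp]; the MCP server is opt-in.
#[derive(Debug, PartialEq)]
struct McpConfig {
    enabled: bool,
}

impl Default for McpConfig {
    fn default() -> Self {
        McpConfig { enabled: false }
    }
}

fn main() {
    let exec = ExecutionConfig::default();
    assert_eq!(exec.max_retries, 2);
    assert_eq!(exec.parallel, 1);
    assert!(exec.sandbox);
    assert!(!McpConfig::default().enabled);
}
```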
Crate Structure
```
ralph/
├── Cargo.toml
├── crates/
│   ├── ralph-cli/            # Binary: CLI + TUI
│   │   ├── src/
│   │   │   ├── main.rs
│   │   │   ├── tui/          # ratatui panels
│   │   │   ├── commands/     # init, plan, run, status, history, mcp
│   │   │   └── config.rs
│   │   └── Cargo.toml
│   │
│   └── ralph-core/           # Library: domain + engine
│       ├── src/
│       │   ├── lib.rs
│       │   ├── domain/             # AllFrame CQRS domain
│       │   │   ├── events.rs       # TaskEvent, EpicEvent, SessionEvent
│       │   │   ├── aggregates.rs   # TaskAggregate, EpicAggregate
│       │   │   ├── commands.rs     # CreateTask, CompleteTask, AssignTask
│       │   │   ├── projections.rs  # BacklogProjection, EpicProgress
│       │   │   └── sagas.rs        # EpicCompletionSaga, RetryFailedSaga
│       │   ├── agent/              # Rig agent definitions
│       │   │   ├── planner.rs      # PRD → tasks
│       │   │   ├── coder.rs        # Autonomous coding
│       │   │   ├── evaluator.rs    # Completion judgment
│       │   │   └── tools.rs        # Rig Tool impls → dispatch commands
│       │   ├── orchestrator.rs     # SELECT→PROMPT→EXECUTE→EVALUATE loop
│       │   ├── mcp.rs              # AllFrame MCP server setup
│       │   └── analytics.rs        # EventQL queries
│       └── Cargo.toml
```
User Stories
US-001: Initialize project
As a developer, I want to run ralph init so that a .ralph/ directory is created with AllSource storage and AllFrame CQRS bootstrapped.
- AC: Creates config file; `AllSourceBackend::production(".ralph/data")` initializes; EventStore, CommandBus, ProjectionRegistry wired up
- Priority: 1 | Labels: core, setup
US-002: Plan from PRD
As a developer, I want to run ralph plan prd.md so that a rig planner agent decomposes my PRD into tasks dispatched as CreateTask commands through AllFrame's CommandBus.
- AC: Planner uses rig tools that dispatch commands; `TaskCreated` and `EpicCreated` events stored in AllSource; backlog projection updated automatically
- Priority: 1 | Depends on: US-001 | Labels: core, planning
US-003: Autonomous execution loop
As a developer, I want to run ralph run so that the orchestrator queries the backlog projection, invokes the coder agent, and dispatches completion commands.
- AC: Full SELECT→PROMPT→EXECUTE→EVALUATE loop; commands dispatched at each phase; AllFrame projections update in real-time; TUI reflects state
- Priority: 1 | Depends on: US-002 | Labels: core, execution
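The SELECT→PROMPT→EXECUTE→EVALUATE loop can be sketched over a mock backlog. The real loop would query `BacklogProjection` and dispatch commands at each phase; every name here (`select_next`, `build_prompt`, the stand-in agents) is illustrative:

```rust
#[derive(Clone)]
struct Task { id: String, priority: u8, done: bool }

// SELECT: highest-priority (lowest number) open task
fn select_next(backlog: &[Task]) -> Option<&Task> {
    backlog.iter().filter(|t| !t.done).min_by_key(|t| t.priority)
}

// PROMPT: the real system would assemble PRD context and acceptance criteria
fn build_prompt(task: &Task) -> String {
    format!("Implement task {}", task.id)
}

// EXECUTE: stand-in for the rig coder agent
fn execute(_prompt: &str) -> String {
    "ok".to_string()
}

// EVALUATE: stand-in for the evaluator agent's pass/fail judgment
fn evaluate(output: &str) -> bool {
    output == "ok"
}

fn main() {
    let mut backlog = vec![
        Task { id: "US-002".into(), priority: 2, done: false },
        Task { id: "US-001".into(), priority: 1, done: false },
    ];
    while let Some(task) = select_next(&backlog).cloned() {
        let passed = evaluate(&execute(&build_prompt(&task)));
        if let Some(t) = backlog.iter_mut().find(|t| t.id == task.id) {
            t.done = passed; // real code would dispatch CompleteTask / FailTask
        }
        if !passed { break; } // avoid spinning in this sketch
    }
    assert!(backlog.iter().all(|t| t.done));
}
```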
US-004: Coder agent with file tools
As a developer, I want the coder agent to use rig tools (ReadFile, WriteFile, RunCommand, SearchCode) that log tool calls as AllSource events.
- AC: Tools sandboxed; `AgentToolCall` events emitted for audit; file modifications tracked
- Priority: 1 | Depends on: US-003 | Labels: core, agent
US-005: Evaluator agent
As a developer, I want an evaluator agent to judge task completion via rig's structured extraction, dispatching CompleteTask or FailTask commands.
- AC: Evaluator receives task description + acceptance criteria + agent output; returns structured pass/fail; command dispatched through CommandBus
- Priority: 1 | Depends on: US-004 | Labels: core, agent
US-006: TUI dashboard
As a developer, I want a ratatui TUI showing the backlog projection, agent output, and epic progress.
- AC: Task list panel from BacklogProjection; agent output streaming; progress bars from EpicProgressProjection; keyboard navigation
- Priority: 2 | Depends on: US-003 | Labels: tui
US-007: MCP server mode
As a developer, I want to run ralph mcp so that the task backlog is accessible to any LLM via MCP tools (list_tasks, create_task, complete_task, task_history).
- AC: AllFrame-MCP exposes handlers over stdio; Claude Code / Cursor can query and manage the backlog
- Priority: 2 | Depends on: US-003 | Labels: mcp
US-008: Task history and analytics
As a developer, I want ralph history US-003 and ralph analytics powered by EventQL over AllSource.
- AC: Full event audit trail; cycle time, throughput, token usage analytics
- Priority: 2 | Depends on: US-003 | Labels: analytics
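Cycle time is one fold over the task event stream: Created → Completed per task. In the real system this would be an EventQL query over AllSource; this std-only sketch works over an in-memory slice of `(task_id, event_type, timestamp_ms)` tuples, a representation assumed here for illustration:

```rust
use std::collections::HashMap;

// Returns task id -> cycle time in ms (Completed timestamp minus Created).
// Tasks without both events are skipped.
fn cycle_times(events: &[(&str, &str, u64)]) -> HashMap<String, u64> {
    let mut created: HashMap<&str, u64> = HashMap::new();
    let mut out = HashMap::new();
    for (task, kind, ts) in events {
        match *kind {
            "TaskCreated" => { created.insert(*task, *ts); }
            "TaskCompleted" => {
                if let Some(start) = created.get(task) {
                    out.insert(task.to_string(), ts - start);
                }
            }
            _ => {}
        }
    }
    out
}

fn main() {
    let events = [
        ("US-001", "TaskCreated", 1_000),
        ("US-001", "TaskCompleted", 61_000),
        ("US-002", "TaskCreated", 5_000), // never completed: excluded
    ];
    let ct = cycle_times(&events);
    assert_eq!(ct["US-001"], 60_000);
    assert!(!ct.contains_key("US-002"));
}
```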
US-009: Parallel execution with git worktrees
As a developer, I want ralph run --parallel 4 with each agent in a git worktree, coordinated by AllFrame sagas.
- AC: No double-claiming (AllSource version-based optimistic concurrency); worktree creation/merge automated; `RetryFailedSaga` handles merge conflicts
- Priority: 2 | Depends on: US-004 | Labels: parallel
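The no-double-claiming property rests on version-based optimistic concurrency: a claim succeeds only if the claimer saw the latest stream version. A std-only sketch of that check (an illustrative stand-in for AllSource's expected-version append, not its actual API):

```rust
use std::collections::HashMap;
use std::sync::Mutex;

struct Claims {
    // task id -> (stream version, claiming agent)
    inner: Mutex<HashMap<String, (u64, Option<String>)>>,
}

impl Claims {
    // Succeeds only if the caller's expected_version matches the current
    // version and the task is unclaimed; the version bump invalidates any
    // concurrent claim attempt that read the old version.
    fn claim(&self, task: &str, agent: &str, expected_version: u64) -> bool {
        let mut map = self.inner.lock().unwrap();
        let entry = map.entry(task.to_string()).or_insert((0, None));
        if entry.0 == expected_version && entry.1.is_none() {
            entry.0 += 1;
            entry.1 = Some(agent.to_string());
            true
        } else {
            false
        }
    }
}

fn main() {
    let claims = Claims { inner: Mutex::new(HashMap::new()) };
    // Both agents read version 0; only the first claim wins.
    assert!(claims.claim("US-009", "agent-a", 0));
    assert!(!claims.claim("US-009", "agent-b", 0));
}
```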
US-010: Epic completion saga
As a developer, I want epics to auto-complete when all child tasks pass, triggered by AllFrame's saga orchestrator.
- AC: `EpicCompletionSaga` listens for `TaskCompleted`; checks sibling status via projection; dispatches `CompleteEpic` if all done
- Priority: 2 | Depends on: US-005 | Labels: sagas
US-011: Import from JSON/Beads
As a developer, I want ralph import prd.json to migrate existing tasks into AllSource events via CommandBus.
- AC: Reads JSON/Beads format; dispatches `CreateTask` commands; preserves IDs, priorities, dependencies
- Priority: 3 | Labels: migration
US-012: Multi-provider agent config
As a developer, I want per-role model/provider configuration so I can use Claude for coding, Haiku for evaluation, and GPT for planning.
- AC: Config supports per-role settings; rig handles provider switching
- Priority: 3 | Depends on: US-005 | Labels: config
Dependencies
```toml
[dependencies]
# AllSource ecosystem
allframe = { version = "0.1", features = ["cqrs-allsource"] }
allframe-mcp = "0.1"
allsource-core = { version = "0.11", features = ["embedded"] }

# AI agents
rig-core = "0.31"

# TUI
ratatui = "0.29"
crossterm = "0.28"

# Runtime
tokio = { version = "1", features = ["full"] }
clap = { version = "4", features = ["derive"] }
serde = { version = "1", features = ["derive"] }
toml = "1"
tracing = "0.1"
```
Why This Stack
| Concern | allframe | allsource-core | rig-core |
|---|---|---|---|
| Command validation | CommandBus | — | — |
| Event persistence | CQRS backend trait | WAL + Parquet + DashMap | — |
| Read models | Projections + Registry | — | — |
| Multi-step coordination | Sagas | — | — |
| Compile-time DI | `arch` module | — | — |
| LLM tool exposure | allframe-mcp | — | — |
| MCP server | allframe-mcp | — | — |
| LLM calls | — | — | Multi-provider agents |
| Tool calling | — | — | `Tool` trait |
| Structured extraction | — | — | `Extractor` agent |
| Event queries | — | EventQL (SQL) | — |
| Analytics | — | Projections + EventQL | — |
AllFrame handles the architecture (CQRS, DI, command routing, projections, sagas).
AllSource handles the durability (WAL, Parquet, crash-safe persistence, EventQL queries).
Rig handles the AI (multi-provider LLM calls, structured tool use, streaming).
AllFrame-MCP handles LLM accessibility (any MCP client can manage the backlog).
Open Questions
- Sandboxing: Containers, firejail, or a custom sandbox for coder agent tools?
- Worktree management: `git2` crate or shell out to `git worktree`?
- Naming: Keep `ralph` or rebrand?
- `allframe-tauri`: Should we also ship a desktop GUI version using AllFrame's Tauri plugin?