# Fabric Mini GUI

A modern desktop client for interacting with Fabric AI patterns, built with CustomTkinter for a sleek, dark-mode friendly experience.
## Table of Contents

- Overview
- Architecture
- Features
- Installation
- Usage
- Configuration
- Keyboard Shortcuts
- Menu Reference
- Project Structure
- Logging
- Troubleshooting
- Version History
- Contributing
- Credits
## Overview

Fabric Mini GUI provides a graphical desktop interface for the Fabric AI framework by Daniel Miessler. Instead of using the Fabric CLI directly, you get a modern dark-mode window where you can:
- Browse and search AI patterns visually
- Select from all your configured AI models (Claude, GPT-4, Ollama, etc.)
- Stream AI responses in real-time
- Manage the Fabric server with start/stop controls and live health monitoring
- Navigate through your response history
The entire application is a single Python file (`fabricgui.py`) with only two dependencies: `requests` and `customtkinter`.
## Architecture

High-level view of how Fabric GUI interacts with the Fabric ecosystem:

```mermaid
graph TB
    subgraph FabricGUI["🖥️ Fabric GUI Application"]
        GUI["FabricGUI<br/>(Main Window)"]
        CM["ConfigManager"]
        OH["OutputHistory"]
        SM["ServerManager"]
        PD["PreferencesDialog"]
        CTX["ContextMenu"]
    end
    subgraph FabricCLI["⚙️ Fabric CLI"]
        SERVE["fabric --serve"]
        PATTERN["fabric -p pattern"]
    end
    subgraph Providers["☁️ AI Providers"]
        CLAUDE["Claude"]
        GPT["GPT-4"]
        OLLAMA["Ollama"]
        OTHER["Other LLMs"]
    end
    subgraph Storage["💾 Local Storage"]
        CFG["~/.fabric_gui/config.json"]
        HIST["~/.fabric_gui/history.json"]
        LOG["~/.fabric_gui/fabric_gui.log"]
        FENV["~/.config/fabric/.env"]
    end
    GUI -->|"Manages"| SM
    GUI -->|"Reads/Writes"| CM
    GUI -->|"Navigates"| OH
    GUI -->|"Opens"| PD
    GUI -->|"Attaches to text widgets"| CTX
    SM -->|"Starts/Stops"| SERVE
    SM -->|"GET /patterns/names"| SERVE
    SM -->|"GET /models/names"| SERVE
    SM -->|"Health checks"| SERVE
    GUI -->|"Spawns subprocess"| PATTERN
    GUI -->|"stdin: user text"| PATTERN
    PATTERN -->|"stdout: AI response"| GUI
    SERVE -->|"Routes to"| CLAUDE
    SERVE -->|"Routes to"| GPT
    SERVE -->|"Routes to"| OLLAMA
    SERVE -->|"Routes to"| OTHER
    CM -->|"Persists"| CFG
    OH -->|"Persists"| HIST
    GUI -->|"Logs to"| LOG
    SM -->|"Reads default model"| FENV
    style FabricGUI fill:#1a1a2e,stroke:#16213e,color:#e94560
    style FabricCLI fill:#0f3460,stroke:#16213e,color:#e94560
    style Providers fill:#533483,stroke:#16213e,color:#e94560
    style Storage fill:#2b2b2b,stroke:#444,color:#ccc
```
Detailed view of all classes, their fields, and relationships:

```mermaid
classDiagram
    class ConfigManager {
        <<static>>
        +Path CONFIG_FILE
        +dict DEFAULT_CONFIG
        +load() Dict~str, Any~
        +save(config: Dict) void
    }
    class OutputHistory {
        +Path HISTORY_FILE
        -List~Dict~ history
        -int max_size
        -int current_index
        +__init__(max_size: int)
        +add(pattern, input_text, output_text)
        +update_current_output(output_text)
        +previous() Optional~Dict~
        +next() Optional~Dict~
        +has_previous() bool
        +has_next() bool
        +load() void
        +save() void
    }
    class ServerManager {
        -str fabric_command
        -str base_url
        -str port_flag
        -Popen process
        -bool is_online
        -bool _monitoring
        -Thread _health_thread
        +__init__(fabric_command, base_url, port_flag)
        +check_health() bool
        +start_health_monitoring(interval, callback)
        +stop_health_monitoring()
        +start_server() bool
        +stop_server(timeout) bool
        +get_patterns() Optional~List~
        +get_models() Dict~str, List~
        +get_default_model() Optional~str~
        +is_running() bool
        -_normalize_base_url(url) str
        -_port_from_base_url(base_url) str
        -_start_server_output_capture()
    }
    class ContextMenu {
        -Widget widget
        -Menu menu
        +__init__(widget: Widget)
        -_gen(ev: str) void
        -_select_all() void
        -_show(event) void
    }
    class PreferencesDialog {
        -CTk parent
        -Dict config
        -Optional~Dict~ result
        -StringVar base_url_var
        -BooleanVar auto_start_var
        -BooleanVar stop_on_exit_var
        -StringVar health_interval_var
        -StringVar fabric_cmd_var
        -StringVar timeout_var
        +__init__(parent, config)
        -_build_ui()
        -_build_server_tab()
        -_build_advanced_tab()
        -_validate_and_collect() Optional~Dict~
        -_on_save()
        -_on_cancel()
    }
    class FabricGUI {
        -Dict app_config
        -OutputHistory history
        -ServerManager server_manager
        -bool cancel_request
        -Thread current_request_thread
        -Popen current_process
        -StringVar base_url_var
        -StringVar pattern_var
        -StringVar model_var
        -StringVar status_var
        -StringVar command_var
        -List~str~ all_patterns
        +__init__()
        -_build_menu()
        -_build_server_frame()
        -_build_pattern_frame()
        -_build_info_frame()
        -_build_io_frame()
        -_setup_shortcuts()
        +on_send(event)
        +on_cancel()
        +on_start_server()
        +on_stop_server()
        +on_test_server()
        +load_patterns()
        +load_models()
        +save_output()
        +copy_output()
        +clear_output()
        +import_file()
        +show_preferences()
        +show_help()
        +on_closing()
        -_process_request(input_text)
        -_filter_patterns(*args)
        -_update_led_status(is_online)
        -_set_ui_processing(processing)
        -_animate_progress()
    }
    FabricGUI --> ConfigManager : loads/saves config
    FabricGUI --> OutputHistory : manages history
    FabricGUI --> ServerManager : controls server
    FabricGUI --> ContextMenu : attaches to text widgets
    FabricGUI --> PreferencesDialog : opens dialog
    FabricGUI --|> CTk : inherits
    PreferencesDialog --|> CTkToplevel : inherits
```
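As a rough illustration of the `ConfigManager` shape above: loading can merge saved values over `DEFAULT_CONFIG` so that settings keys added in newer versions migrate cleanly. A sketch under the assumption that the config is plain JSON (the real implementation in `fabricgui.py` may differ in detail):

```python
import json
from pathlib import Path

# Illustrative sketch of the ConfigManager idea; paths and keys mirror the
# README, but the real class may handle more keys and migrations.
CONFIG_FILE = Path.home() / ".fabric_gui" / "config.json"
DEFAULT_CONFIG = {
    "base_url": "http://localhost:8083",
    "auto_start_server": False,
    "request_timeout": 300,
}

def load(path: Path = CONFIG_FILE) -> dict:
    """Merge saved values over defaults so new keys get sensible values."""
    config = dict(DEFAULT_CONFIG)
    try:
        config.update(json.loads(path.read_text(encoding="utf-8")))
    except (OSError, json.JSONDecodeError):
        pass  # missing or corrupt file -> fall back to defaults
    return config

def save(config: dict, path: Path = CONFIG_FILE) -> None:
    """Write the config atomically enough for a single-user desktop app."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2), encoding="utf-8")
```

The defaults-then-update order is what makes the "defaults and migration" behavior cheap: unknown old keys are preserved, and missing new keys fall back to `DEFAULT_CONFIG`.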
Step-by-step view of what happens when the user clicks Send:

```mermaid
flowchart TD
    A["👤 User clicks Send<br/>(or Ctrl+Enter)"] --> B{"Pattern<br/>selected?"}
    B -->|No| C["⚠️ Show warning:<br/>Please select a pattern"]
    B -->|Yes| D{"Server<br/>online?"}
    D -->|No| E["Prompt: Start server?"]
    E -->|No| F["❌ Cancel"]
    E -->|Yes| G["🚀 Start server<br/>Wait for health check"]
    G --> H
    D -->|Yes| H["💾 Save config<br/>(pattern, model, URL)"]
    H --> I["🔧 Build command:<br/>fabric -p pattern [-m model]"]
    I --> J["🔒 Disable Send button<br/>Enable Cancel button<br/>Start progress animation"]
    J --> K["📤 Spawn subprocess<br/>(CREATE_NO_WINDOW on Windows)"]
    K --> L["📝 Write input text to stdin<br/>(in background thread)"]
    L --> M["Add history entry"]
    M --> N{"Read stdout<br/>chunk"}
    N -->|Chunk received| O{"Cancel<br/>requested?"}
    O -->|Yes| P["🛑 Terminate process"]
    O -->|No| Q{"Filter<br/>line?"}
    Q -->|Yes: Ollama noise| N
    Q -->|No| R["📺 Append to output<br/>(via self.after)"]
    R --> N
    N -->|EOF + process done| S["✅ Update status:<br/>Completed"]
    S --> T["💾 Save history"]
    T --> U["🔓 Re-enable Send<br/>Disable Cancel<br/>Stop animation"]
    P --> V["Set status: Cancelled"]
    V --> U
    style A fill:#1f6aa5,color:#fff
    style C fill:#cc6600,color:#fff
    style F fill:#cc3333,color:#fff
    style P fill:#cc3333,color:#fff
    style S fill:#2d8f2d,color:#fff
```
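The stdin/stdout handling in the flowchart can be sketched as a small streaming helper. Names here are illustrative (`stream_pattern`, `should_filter`); in the real app each surviving chunk is marshalled back to the Tk main loop with `self.after()` rather than called directly:

```python
import subprocess
import threading

def stream_pattern(cmd: list[str], input_text: str, on_chunk,
                   should_filter=lambda line: False) -> int:
    """Spawn the CLI, feed stdin from a background thread, stream stdout.

    on_chunk is called for each line that survives filtering; in the GUI this
    callback would hand the text to the Tk main loop via self.after().
    """
    proc = subprocess.Popen(
        cmd,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
        encoding="utf-8",
    )

    def feed():
        proc.stdin.write(input_text)
        proc.stdin.close()  # EOF tells the CLI the input is complete

    # Writing on a separate thread avoids a deadlock when both pipes fill up.
    threading.Thread(target=feed, daemon=True).start()

    for line in proc.stdout:      # yields lines as the AI generates them
        if should_filter(line):   # e.g. drop Ollama startup noise
            continue
        on_chunk(line)
    return proc.wait()
```

The background writer thread is the important detail: writing stdin and reading stdout on the same thread can deadlock once either pipe buffer fills.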
How ServerManager controls the Fabric server process:

```mermaid
stateDiagram-v2
    [*] --> Offline : App starts
    Offline --> Starting : on_start_server()
    Starting --> Online : Health check passes
    Starting --> Offline : Process exited immediately
    Online --> Online : Health check (every 5s) ✓
    Online --> Offline : Health check fails
    Online --> Stopping : on_stop_server()
    Stopping --> Offline : Process terminated
    Stopping --> Offline : Force kill (after timeout)
    Offline --> Online : Auto-start on launch
    Offline --> Prompted : Send request while offline
    Prompted --> Starting : User says "Yes"
    Prompted --> Offline : User says "No"
    state Online {
        [*] --> Healthy
        Healthy --> Healthy : GET /readyz → 200
        Healthy --> Degraded : Request fails
        Degraded --> Healthy : Next check passes
    }
    note right of Starting
        Command: fabric --serve --address :PORT
        Creates new process group (Windows)
        Captures stdout in background thread
        Waits 2s for process stability
    end note
    note right of Online
        LED indicator: 🟢
        Patterns auto-loaded
        Models fetched
    end note
    note right of Offline
        LED indicator: 🔴
        Server buttons update
    end note
```
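The monitoring loop behind these states is a daemon thread that polls a health check and reports transitions. A sketch with an injectable `check` callable (the real `ServerManager` issues `GET /readyz` with `requests`; `HealthMonitor` here is a simplified stand-in, not the actual class):

```python
import threading
import time

class HealthMonitor:
    """Poll a check() callable on a background thread; report state changes."""

    def __init__(self, check, on_change, interval: float = 5.0):
        self._check = check          # e.g. lambda: requests.get(url + "/readyz", timeout=2).ok
        self._on_change = on_change  # called with True/False when the state flips
        self._interval = interval
        self._running = False
        self._online = None          # unknown until the first poll

    def _loop(self):
        while self._running:
            online = False
            try:
                online = bool(self._check())
            except Exception:
                pass  # network errors count as offline
            if online != self._online:
                self._online = online
                self._on_change(online)  # in the GUI: flip the LED via self.after()
            time.sleep(self._interval)

    def start(self):
        self._running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def stop(self):
        self._running = False
```

Reporting only on transitions (rather than every poll) is what keeps the LED and button states from flickering on every 5-second check.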
What happens when the application launches:

```mermaid
sequenceDiagram
    participant User
    participant GUI as FabricGUI.__init__
    participant CM as ConfigManager
    participant SM as ServerManager
    participant FS as File System
    User->>GUI: python fabricgui.py
    GUI->>CM: load()
    CM->>FS: Read ~/.fabric_gui/config.json
    FS-->>CM: Config dict (or defaults)
    CM-->>GUI: app_config
    GUI->>GUI: OutputHistory()
    GUI->>FS: Read ~/.fabric_gui/history.json
    GUI->>SM: ServerManager(fabric_cmd, base_url, port_flag)
    GUI->>GUI: Build UI components
    Note over GUI: _build_menu()<br/>_build_server_frame()<br/>_build_pattern_frame()<br/>_build_info_frame()<br/>_build_io_frame()<br/>_setup_shortcuts()
    GUI->>SM: start_health_monitoring(interval=5s)
    Note over SM: Background thread begins<br/>GET /readyz every 5s
    alt auto_start_server = true
        GUI->>SM: start_server() (after 600ms)
        SM->>SM: fabric --serve --address :PORT
    end
    GUI->>SM: load_patterns() (after 800ms)
    SM->>SM: GET /patterns/names
    SM-->>GUI: Pattern list → dropdown
    GUI->>SM: load_models() (after 1200ms)
    SM->>SM: fabric --listmodels
    SM-->>GUI: Models by provider → dropdown
    SM->>FS: Read ~/.config/fabric/.env
    FS-->>SM: DEFAULT_MODEL
    SM-->>GUI: Default model label
    GUI-->>User: Window ready
```
Visual map of how the UI panels are organized:

```mermaid
block-beta
    columns 1
    block:top["Server Frame"]
        columns 6
        LED["🔴 LED"] URL["Base URL Label"] Test["Test"] Start["Start"] Stop["Stop"] space
    end
    block:pattern["Pattern Selection Frame"]
        columns 4
        Search["🔍 Search Input"] space:3
        PatternLabel["Pattern:"] PatternCombo["Pattern Dropdown ▼"] Refresh["Refresh Patterns"] space
        ModelLabel["Model:"] ModelCombo["Model Dropdown ▼"] DefaultModel["Default: model-name"] space
    end
    block:info["Info / Actions Frame"]
        columns 3
        Status["Status: Ready"] CommandPreview["Command: fabric -p pattern"] Actions["Cancel | Send"]
    end
    block:io["Input / Output Frame"]
        columns 2
        block:input["Input Panel"]
            InputToolbar["Import | Paste | Clear"]
            InputText["📝 Input Textbox"]
        end
        block:output["Output Panel"]
            OutputToolbar["Copy | Save | Clear | ◀ ▶"]
            OutputText["📄 Output Textbox"]
        end
    end
```
## Features

- Pattern selection and execution via the Fabric CLI subprocess
- Real-time streaming responses: output appears chunk by chunk as the AI generates it
- Model selection: browse all configured AI models grouped by provider
- Command preview: see the exact `fabric` command before executing
- CustomTkinter integration for a modern, dark-mode friendly aesthetic
- Clean, intuitive layout with organized frames
- Animated progress indicator with pulsing gold dots during processing
- Output toolbar (Copy, Save, Clear)
- History navigation (◀/▶ buttons)
- Status bar with visual feedback
- Context menus: right-click support for Cut/Copy/Paste/Select All
- Interactive pattern search: real-time filtering with match count display
- Visual LED status indicator (🔴 offline / 🟢 online)
- Start/Stop server controls directly from the GUI
- Automatic health monitoring (configurable interval, default 5s)
- Auto-load patterns: patterns and models load automatically when the server comes online
- Pre-request server validation with auto-start prompt
- Graceful shutdown handling with force-kill fallback
- Persistent settings via JSON (auto-saved)
- Tabbed Preferences dialog (Server + Advanced)
- Configurable server URL, health check interval, and request timeout
- Window geometry persistence across sessions
- Copy output to clipboard
- Save output to file (TXT/MD)
- Clear output display
- Navigate through response history (up to 50 entries, persisted to disk)
- Import `.txt` or `.md` files directly into the input box via the Import button
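History navigation amounts to a cursor walking a bounded list. A simplified sketch of the idea behind `OutputHistory` (persistence omitted; the real class also saves to `~/.fabric_gui/history.json`):

```python
class OutputHistory:
    """Bounded list of responses with a cursor for back/forward navigation."""

    def __init__(self, max_size: int = 50):
        self.max_size = max_size
        self.history: list[dict] = []
        self.current_index = -1

    def add(self, pattern: str, input_text: str, output_text: str) -> None:
        self.history.append(
            {"pattern": pattern, "input": input_text, "output": output_text}
        )
        if len(self.history) > self.max_size:
            self.history.pop(0)  # drop the oldest entry to stay bounded
        self.current_index = len(self.history) - 1  # new entry becomes current

    def has_previous(self) -> bool:
        return self.current_index > 0

    def has_next(self) -> bool:
        return self.current_index < len(self.history) - 1

    def previous(self):
        if not self.has_previous():
            return None
        self.current_index -= 1
        return self.history[self.current_index]

    def next(self):
        if not self.has_next():
            return None
        self.current_index += 1
        return self.history[self.current_index]
```

The ◀/▶ buttons simply call `previous()`/`next()` and enable themselves based on `has_previous()`/`has_next()`.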
## Installation

Requirements:

- Python 3.11+
- Fabric CLI installed and on PATH
- `requests` library
- `customtkinter` library

### Windows Quick Start

1. Clone or download this repository
2. Double-click `start.bat`
   - First run: creates a virtual environment, installs dependencies, and launches the app
   - Subsequent runs: activates the venv and launches immediately

### Manual Installation

```bash
# Clone the repository
git clone https://github.com/Digitalgods2/FabricGui.git
cd FabricGui

# Install dependencies
pip install -r requirements.txt

# Run the application
python fabricgui.py
```

### Building a Standalone Executable

```bash
pip install pyinstaller
pyinstaller fabricgui.spec
```

The standalone `FabricGUI.exe` will be created in the `dist/` folder (no console window).
## Usage

1. **Configure the server**
   - The default server URL is `http://localhost:8083`
   - Open Edit → Preferences to change the URL or other settings
   - Click Test to verify connectivity
2. **Start the server**
   - Click Start to launch Fabric's built-in server
   - Wait for the LED indicator to turn 🟢 green
   - Patterns and models load automatically
3. **Select your model**
   - The Model dropdown shows all available AI models grouped by provider
   - Click the "Default: ..." label to reset to your configured default model
4. **Send a request**
   - Select a pattern from the dropdown (use the search box to filter)
   - Enter your input text (or click Import to load a file)
   - Review the command preview
   - Click Send or press `Ctrl+Enter`
   - Watch the response stream in the output panel
Output toolbar:

| Action | Button | Shortcut |
|---|---|---|
| Copy to clipboard | Copy | `Ctrl+C` |
| Save to file | Save | `Ctrl+S` |
| Clear display | Clear | – |
| Previous history | ◀ | `Alt+Left` |
| Next history | ▶ | `Alt+Right` |
Status indicators:

| LED | Meaning |
|---|---|
| 🔴 Red | Server offline |
| 🟢 Green | Server online |
Auto-Start: Enable in Preferences to start the server automatically on app launch.
Pre-Request Validation: If the server is offline when you send a request, the app prompts: "Would you like to start it now?"
Cancelling: click the Cancel button during processing. The subprocess is terminated gracefully (or force-killed after a timeout).
## Configuration

Settings are stored at `~/.fabric_gui/config.json` and auto-saved:

```json
{
  "base_url": "http://localhost:8083",
  "last_pattern": "",
  "last_model": "",
  "window_geometry": "1200x800",
  "auto_start_server": false,
  "stop_server_on_exit": true,
  "server_health_check_interval": 5,
  "request_timeout": 300,
  "fabric_command": "fabric",
  "port_flag": "--address"
}
```

| Setting | Default | Description |
|---|---|---|
| `base_url` | `http://localhost:8083` | Fabric server address |
| `auto_start_server` | `false` | Launch server on app start |
| `stop_server_on_exit` | `true` | Prompt to stop server on close |
| `server_health_check_interval` | `5` | Health check frequency (seconds) |
| `request_timeout` | `300` | Max processing time (seconds) |
| `fabric_command` | `fabric` | Path to the Fabric executable |
| `port_flag` | `--address` | Server bind flag (auto-corrected from `--port`) |
## Keyboard Shortcuts

| Shortcut | Action |
|---|---|
| `Ctrl+Enter` | Send request |
| `Ctrl+S` | Save output to file |
| `Ctrl+C` | Copy output to clipboard |
| `Ctrl+,` | Open Preferences |
| `Alt+Left` | Previous history entry |
| `Alt+Right` | Next history entry |
| Right-click | Context menu (Cut/Copy/Paste/Select All) |
## Menu Reference

| Menu | Item | Shortcut | Action |
|---|---|---|---|
| File | Save Output... | `Ctrl+S` | Save output to file |
| File | Exit | – | Close application |
| Edit | Preferences... | `Ctrl+,` | Open settings dialog |
| Edit | Copy Output | `Ctrl+C` | Copy to clipboard |
| Edit | Clear Output | – | Clear output display |
| Edit | Paste Input | – | Paste from clipboard |
| Edit | Clear Input | – | Clear input text |
| History | Previous | `Alt+Left` | Previous response |
| History | Next | `Alt+Right` | Next response |
| Help | User Guide | – | Comprehensive help window |
| Help | View Logs | – | Open log file |
| Help | About | – | Application info |
## Project Structure

```
FabricGui/
├── fabricgui.py        # Entire application (1,930 lines)
├── fabricgui.spec      # PyInstaller build configuration
├── requirements.txt    # Python dependencies
├── start.bat           # Windows quick-start launcher
├── README.md           # This file
├── .gitignore
└── build/              # PyInstaller build artifacts
```

Runtime files (created automatically):

```
~/.fabric_gui/
├── config.json         # Persistent configuration
├── history.json        # Response history (up to 50 entries)
└── fabric_gui.log      # Rotating log files (5 MB × 3 backups)
```

| Class | Lines | Responsibility |
|---|---|---|
| `ConfigManager` | 248–314 | Loads/saves JSON config with defaults and migration |
| `OutputHistory` | 321–387 | Manages response history with persistence |
| `ServerManager` | 394–660 | Controls the Fabric server process, health checks, pattern/model fetching |
| `ContextMenu` | 667–714 | Right-click Cut/Copy/Paste/Select All for any text widget |
| `PreferencesDialog` | 721–898 | Tabbed settings dialog with validation |
| `FabricGUI` | 905–1924 | Main window: all UI building, event handling, and request processing |
## Logging

Application logs are saved to `~/.fabric_gui/fabric_gui.log`:

- Max file size: 5 MB per file
- Backup files: 3 rotated backups
- Total max: ~20 MB
- View logs: Help → View Logs
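That rotation policy maps directly onto the standard library's `logging.handlers.RotatingFileHandler`. Roughly (`setup_logging` is an illustrative name; the format string is an assumption, not copied from `fabricgui.py`):

```python
import logging
import logging.handlers
from pathlib import Path

def setup_logging(log_dir: Path = Path.home() / ".fabric_gui") -> logging.Logger:
    """Configure a rotating log: 5 MB per file, 3 backups (~20 MB total)."""
    log_dir.mkdir(parents=True, exist_ok=True)
    handler = logging.handlers.RotatingFileHandler(
        log_dir / "fabric_gui.log",
        maxBytes=5 * 1024 * 1024,  # roll over at 5 MB
        backupCount=3,             # keep fabric_gui.log.1 .. .3
        encoding="utf-8",          # emoji-safe on Windows
    )
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
    )
    logger = logging.getLogger("fabric_gui")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    return logger
```

When the active file exceeds `maxBytes`, the handler renames it to `.1` (shifting older backups up) and starts a fresh file, which is where the ~20 MB ceiling comes from.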
## Troubleshooting

### Cannot connect to the server

- Verify the Fabric server is running (LED should be 🟢)
- Check the base URL in Preferences
- Click Test to diagnose connectivity

### Requests fail or return errors

- Ensure the server is accessible and responding
- Check your API key if required
- Review logs (Help → View Logs) for detailed error messages

### Requests time out

- Increase the timeout in Edit → Preferences → Advanced
- Check your network connection
- Verify the server is responding (click Test)

### Ollama startup message appears

- This is normal if you are not running a local Ollama instance
- The Fabric CLI automatically checks for local models on startup
- The GUI filters this message to keep the output clean
- It does not affect cloud models (Claude, GPT-4, etc.)

### Server won't start

- Ensure `fabric` is installed and on your system PATH
- Try running `fabric --serve` in a terminal to see errors directly
- Check that the port (default 8083) is not already in use
## Version History

- ✅ Interactive Pattern Search: real-time filtering with auto-selection of the first match
- ✅ Improved Readability: enhanced dropdown styling with better contrast, taller dropdowns (25 items)
- ✅ UI Polish: consistent font sizing, status bar shows match count while searching
- ✅ Reliable AI Processing: switched to direct subprocess execution for guaranteed correct output
- ✅ Enhanced UI: vertical scrollbar on pattern dropdown, expanded list (40 items), increased fonts
- ✅ Smarter Server Management: improved stop logic, better error handling
- ✅ Bug Fixes: output buffering/hanging, encoding issues with emojis, filtered startup errors
- ✅ CustomTkinter Migration: complete UI overhaul with modern look and dark mode
- ✅ Context Menus: right-click support for text widgets
- ✅ Bug Fixes: infinite pattern loading loop, endpoint fix, TypeError, startup crashes
- ✅ Server management with Start/Stop controls and LED indicator
- ✅ Automatic health monitoring and pre-request validation
- ✅ Auto-start server option and graceful shutdown
- ✅ Cross-platform process management
- ✅ Complete refactoring with improved architecture
- ✅ Configuration persistence, output history, request cancellation
- ✅ Progress indicators, menu system, keyboard shortcuts
- ✅ Comprehensive logging
- Basic Fabric pattern execution with input/output interface
## Contributing

Suggestions and improvements are welcome! Please ensure:

- Code follows the existing style
- All features are tested
- Documentation is updated
## Credits

Fabric GUI designed and developed by DigitalGods.ai. Built for the Fabric AI framework by Daniel Miessler.

Happy pattern processing!