|
1 | | -<img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM" height="120" width="250"> |
| 1 | +<div align="center"> |
| 2 | + <img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM::MCP" height="120" width="250"> |
2 | 3 |
|
3 | | -**Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.** |
| 4 | + <strong>MCP made simple for RubyLLM.</strong> |
4 | 5 |
|
5 | | -This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations. |
6 | | - |
7 | | -For a more detailed guide, see the [RubyLLM::MCP docs](https://rubyllm-mcp.com/). |
8 | | - |
9 | | -Currently full support for MCP protocol version up to `2025-06-18`. |
10 | | - |
11 | | -<div class="badge-container"> |
12 | | - <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a> |
13 | | - <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a> |
| 6 | + <p> |
| 7 | + <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a> |
| 8 | + <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a> |
| 9 | + </p> |
14 | 10 | </div> |
15 | 11 |
|
16 | | -## RubyLLM::MCP Features |
17 | | - |
18 | | -- 🎛️ **Dual SDK Support** _(v0.8+)_: Choose between native full-featured implementation or official MCP SDK |
19 | | -- 🔌 **Multiple Transport Types**: Streamable HTTP, STDIO, and SSE transports |
20 | | -- 🛠️ **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools |
21 | | -- 📄 **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations |
22 | | -- 🎯 **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions |
23 | | -- 🎨 **Client Features**: Support for sampling, roots, progress tracking, human-in-the-loop, and elicitation |
24 | | -- 🔧 **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration |
25 | | -- 🔄 **Multiple Client Management**: Create and manage multiple MCP clients simultaneously for different servers and purposes |
26 | | -- 📚 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM |
27 | | - |
28 | | -## Installation |
29 | | - |
30 | | -```bash |
31 | | -bundle add ruby_llm-mcp |
32 | | -``` |
| 12 | +`ruby_llm-mcp` connects Ruby applications to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers and integrates them directly with [RubyLLM](https://github.com/crmne/ruby_llm). |
33 | 13 |
|
34 | | -or add this line to your application's Gemfile: |
| 14 | +## Simple Configuration |
35 | 15 |
|
36 | 16 | ```ruby |
37 | | -gem 'ruby_llm-mcp' |
38 | | -``` |
39 | | - |
40 | | -And then execute: |
41 | | - |
42 | | -```bash |
43 | | -bundle install |
44 | | -``` |
45 | | - |
46 | | -Or install it yourself as: |
47 | | - |
48 | | -```bash |
49 | | -gem install ruby_llm-mcp |
50 | | -``` |
51 | | - |
52 | | -## Choosing an Adapter |
53 | | - |
54 | | -Starting with version 0.8.0, RubyLLM MCP supports multiple SDK adapters: |
| 17 | +require 'ruby_llm/mcp' |
55 | 18 |
|
56 | | -### RubyLLM Adapter (Default) |
| 19 | +RubyLLM.configure do |config| |
| 20 | + config.openai_api_key = ENV.fetch('OPENAI_API_KEY') |
| 21 | +end |
57 | 22 |
|
58 | | -The native implementation with full MCP protocol support: |
| 23 | +RubyLLM::MCP.configure do |config| |
| 24 | + config.request_timeout = 8_000 |
| 25 | +end |
59 | 26 |
|
60 | | -```ruby |
61 | 27 | client = RubyLLM::MCP.client( |
62 | | - name: "server", |
63 | | - adapter: :ruby_llm, # Default, can be omitted |
| 28 | + name: 'filesystem', |
64 | 29 | transport_type: :stdio, |
65 | | - config: { command: "mcp-server" } |
| 30 | + config: { |
| 31 | + command: 'npx', |
| 32 | + args: ['@modelcontextprotocol/server-filesystem', Dir.pwd] |
| 33 | + } |
66 | 34 | ) |
67 | 35 | ``` |
68 | 36 |
|
69 | | -**Features**: All MCP features including SSE transport, sampling, roots, progress tracking, etc. |
70 | | - |
71 | | -### MCP SDK Adapter |
72 | | - |
73 | | -The official Anthropic-maintained SDK: |
| 37 | +## Core Use Cases |
74 | 38 |
|
75 | 39 | ```ruby |
76 | | -# Add to Gemfile |
77 | | -gem 'mcp', '~> 0.7' |
| 40 | +# Use MCP tools in a chat |
| 41 | +chat = RubyLLM.chat(model: 'gpt-4o-mini') |
| 42 | +chat.with_tools(*client.tools) |
78 | 43 |
|
79 | | -# Use in code |
80 | | -client = RubyLLM::MCP.client( |
81 | | - name: "server", |
82 | | - adapter: :mcp_sdk, |
83 | | - transport_type: :stdio, |
84 | | - config: { command: "mcp-server" } |
85 | | -) |
| 44 | +puts chat.ask('List the Ruby files in this project and summarize what you find.') |
86 | 45 | ``` |
87 | 46 |
|
88 | | -**Features**: Core MCP features (tools, resources, prompts, resource templates, logging). No sampling, roots, or other advanced client features. |
89 | | - |
90 | | -See the [Adapters Guide](https://rubyllm-mcp.com/guides/adapters.html) for detailed comparison. |
91 | | - |
92 | | -## Usage |
93 | | - |
94 | | -### Basic Setup |
95 | | - |
96 | | -First, configure your RubyLLM client and create an MCP connection: |
97 | | - |
98 | 47 | ```ruby |
99 | | -require 'ruby_llm/mcp' |
100 | | - |
101 | | -# Configure RubyLLM |
102 | | -RubyLLM.configure do |config| |
103 | | - config.openai_api_key = "your-api-key" |
104 | | -end |
105 | | - |
106 | | -# Connect to an MCP server via SSE |
107 | | -client = RubyLLM::MCP.client( |
108 | | - name: "my-mcp-server", |
109 | | - transport_type: :sse, |
110 | | - config: { |
111 | | - url: "http://localhost:9292/mcp/sse" |
112 | | - } |
113 | | -) |
| 48 | +# Add a server resource to chat context |
| 49 | +resource = client.resource('project_readme') |
114 | 50 |
|
115 | | -# Or connect via stdio |
116 | | -client = RubyLLM::MCP.client( |
117 | | - name: "my-mcp-server", |
118 | | - transport_type: :stdio, |
119 | | - config: { |
120 | | - command: "node", |
121 | | - args: ["path/to/mcp-server.js"], |
122 | | - env: { "NODE_ENV" => "production" } |
123 | | - } |
124 | | -) |
| 51 | +chat = RubyLLM.chat(model: 'gpt-4o-mini') |
| 52 | +chat.with_resource(resource) |
125 | 53 |
|
126 | | -# Or connect via streamable HTTP |
127 | | -client = RubyLLM::MCP.client( |
128 | | - name: "my-mcp-server", |
129 | | - transport_type: :streamable, |
130 | | - config: { |
131 | | - url: "http://localhost:8080/mcp", |
132 | | - headers: { "Authorization" => "Bearer your-token" } |
133 | | - } |
134 | | -) |
| 54 | +puts chat.ask('Summarize this project in 5 bullets.') |
135 | 55 | ``` |
136 | 56 |
|
137 | | -### Using MCP Tools with RubyLLM |
138 | | - |
139 | 57 | ```ruby |
140 | | -# Get available tools from the MCP server |
141 | | -tools = client.tools |
142 | | -puts "Available tools:" |
143 | | -tools.each do |tool| |
144 | | - puts "- #{tool.name}: #{tool.description}" |
145 | | -end |
| 58 | +# Execute a predefined MCP prompt with arguments |
| 59 | +prompt = client.prompt('code_review') |
| 60 | +chat = RubyLLM.chat(model: 'gpt-4o-mini') |
146 | 61 |
|
147 | | -# Create a chat session with MCP tools |
148 | | -chat = RubyLLM.chat(model: "gpt-4") |
149 | | -chat.with_tools(*client.tools) |
| 62 | +response = chat.ask_prompt(prompt, arguments: { |
| 63 | + language: 'ruby', |
| 64 | + focus: 'security' |
| 65 | +}) |
150 | 66 |
|
151 | | -# Ask a question that will use the MCP tools |
152 | | -response = chat.ask("Can you help me search for recent files in my project?") |
153 | 67 | puts response |
154 | 68 | ``` |
155 | 69 |
|
156 | | -### Manual Tool Execution |
157 | | - |
158 | | -You can also execute MCP tools directly: |
159 | | - |
160 | 70 | ```ruby |
161 | | -# Tools Execution |
162 | | -tool = client.tool("search_files") |
163 | | - |
164 | | -# Execute a specific tool |
165 | | -result = tool.execute( |
166 | | - name: "search_files", |
167 | | - parameters: { |
168 | | - query: "*.rb", |
169 | | - directory: "/path/to/search" |
| 71 | +# Authenticate to a protected MCP server with browser OAuth |
| 72 | +client = RubyLLM::MCP.client( |
| 73 | + name: 'oauth-server', |
| 74 | + transport_type: :streamable, |
| 75 | + start: false, |
| 76 | + config: { |
| 77 | + url: 'https://mcp.example.com/mcp', |
| 78 | + oauth: { scope: 'mcp:read mcp:write' } |
170 | 79 | } |
171 | 80 | ) |
172 | 81 |
|
173 | | -puts result |
| 82 | +client.oauth(type: :browser).authenticate |
| 83 | +client.start |
174 | 84 | ``` |
175 | 85 |
|
176 | | -### Working with Resources |
177 | | - |
178 | | -MCP servers can provide access to resources - structured data that can be included in conversations. Resources come in two types: normal resources and resource templates. |
179 | | - |
180 | | -#### Normal Resources |
181 | | - |
182 | 86 | ```ruby |
183 | | -# Get available resources from the MCP server |
184 | | -resources = client.resources |
185 | | -puts "Available resources:" |
186 | | -resources.each do |resource| |
187 | | - puts "- #{resource.name}: #{resource.description}" |
| 87 | +# Poll a long-running MCP task and fetch its final result |
| 88 | +task = client.task_get('task-123') |
| 89 | + |
| 90 | +until task.completed? || task.failed? || task.cancelled? |
| 91 | + sleep((task.poll_interval || 250) / 1000.0) |
| 92 | + task = task.refresh |
188 | 93 | end |
189 | 94 |
|
190 | | -# Access a specific resource by name |
191 | | -file_resource = client.resource("project_readme") |
192 | | -content = file_resource.content |
193 | | -puts "Resource content: #{content}" |
| 95 | +if task.completed? |
| 96 | + payload = client.task_result(task.task_id) |
| 97 | + puts payload.dig('content', 0, 'text') |
| 98 | +else |
| 99 | + puts "Task ended with status: #{task.status}" |
| 100 | +end |
| 101 | +``` |
194 | 102 |
|
195 | | -# Include a resource in a chat conversation for reference with an LLM |
196 | | -chat = RubyLLM.chat(model: "gpt-4") |
197 | | -chat.with_resource(file_resource) |
| 103 | +## Support At A Glance |
198 | 104 |
|
199 | | -# Or add a resource directly to the conversation |
200 | | -file_resource.include(chat) |
| 105 | +- **Native MCP client implementation (`:ruby_llm`)** with full protocol support through `2025-11-25` |
| 106 | +- **Official MCP SDK adapter support (`:mcp_sdk`)** via the `mcp` gem for teams that prefer SDK-backed integration |
| 107 | +- **OAuth implementation** for authenticated streamable HTTP MCP servers |
| 108 | +- **Transports:** `stdio`, `sse`, `streamable` / `streamable_http` |
| 109 | +- **Core server features:** tools, resources, resource templates, prompts, notifications |
| 110 | +- **Advanced client features:** sampling, roots, progress tracking, human-in-the-loop, elicitation |
| 111 | +- **Task lifecycle APIs** (`tasks/list`, `tasks/get`, `tasks/result`, `tasks/cancel`) are experimental |
201 | 112 |
|
202 | | -response = chat.ask("Can you summarize this README file?") |
203 | | -puts response |
204 | | -``` |
| 113 | +> [!WARNING] |
| 114 | +> MCP task support is experimental and subject to change in both the MCP spec and this gem's implementation. |
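
Both adapters expose the same client interface, so switching is a one-line change. A minimal sketch of selecting an adapter explicitly (assuming the optional `mcp` gem is installed when using `:mcp_sdk`):

```ruby
require 'ruby_llm/mcp'

client = RubyLLM::MCP.client(
  name: 'filesystem',
  adapter: :mcp_sdk, # or :ruby_llm, the default native implementation
  transport_type: :stdio,
  config: {
    command: 'npx',
    args: ['@modelcontextprotocol/server-filesystem', Dir.pwd]
  }
)
```

Note that the `:mcp_sdk` adapter covers the core server features (tools, resources, prompts), while advanced client features such as sampling and roots require the native `:ruby_llm` adapter.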
205 | 115 |
|
206 | | -#### Resource Templates |
| 116 | +## Install |
207 | 117 |
|
208 | | -Resource templates are parameterized resources that can be dynamically configured: |
| 118 | +Add to your Gemfile: |
209 | 119 |
|
210 | 120 | ```ruby |
211 | | -# Get available resource templates |
212 | | -templates = client.resource_templates |
213 | | -log_template = client.resource_template("application_logs") |
214 | | - |
215 | | -# Use a template with parameters |
216 | | -chat = RubyLLM.chat(model: "gpt-4") |
217 | | -chat.with_resource_template(log_template, arguments: { |
218 | | - date: "2024-01-15", |
219 | | - level: "error" |
220 | | -}) |
221 | | - |
222 | | -response = chat.ask("What errors occurred on this date?") |
223 | | -puts response |
224 | | - |
225 | | -# You can also get templated content directly |
226 | | -content = log_template.to_content(arguments: { |
227 | | - date: "2024-01-15", |
228 | | - level: "error" |
229 | | -}) |
230 | | -puts content |
| 121 | +gem 'ruby_llm-mcp' |
231 | 122 | ``` |
232 | 123 |
|
233 | | -### Working with Prompts |
234 | | - |
235 | | -MCP servers can provide predefined prompts that can be used in conversations: |
| 124 | +Optional (for `:mcp_sdk` adapter): |
236 | 125 |
|
237 | 126 | ```ruby |
238 | | -# Get available prompts from the MCP server |
239 | | -prompts = client.prompts |
240 | | -puts "Available prompts:" |
241 | | -prompts.each do |prompt| |
242 | | - puts "- #{prompt.name}: #{prompt.description}" |
243 | | - prompt.arguments.each do |arg| |
244 | | - puts " - #{arg.name}: #{arg.description} (required: #{arg.required})" |
245 | | - end |
246 | | -end |
247 | | - |
248 | | -# Use a prompt in a conversation |
249 | | -greeting_prompt = client.prompt("daily_greeting") |
250 | | -chat = RubyLLM.chat(model: "gpt-4") |
251 | | - |
252 | | -# Method 1: Ask prompt directly |
253 | | -response = chat.ask_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" }) |
254 | | -puts response |
255 | | - |
256 | | -# Method 2: Add prompt to chat and then ask |
257 | | -chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" }) |
258 | | -response = chat.ask("Continue with the greeting") |
| 127 | +gem 'mcp', '~> 0.7' |
259 | 128 | ``` |
260 | 129 |
|
261 | | -## Development |
262 | | - |
263 | | -After checking out the repo, run `bundle` to install dependencies. Then, run `bundle exec rake` to run the tests. Tests currently use `bun` to run test MCP servers. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
264 | | -
265 | | -There are also examples you can run to verify the gem is working as expected.
| 130 | +Then run: |
266 | 131 |
|
267 | 132 | ```bash |
268 | | -bundle exec ruby examples/tools/local_mcp.rb |
| 133 | +bundle install |
269 | 134 | ``` |
270 | 135 |
|
| 136 | +## Setup |
| 137 | + |
| 138 | +1. Set your RubyLLM provider credentials (for example `OPENAI_API_KEY`). |
| 139 | +2. Start or access an MCP server. |
| 140 | +3. Create a `RubyLLM::MCP.client` and attach its tools/resources/prompts to chat flows. |
| 141 | + |
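The steps above, put together, make a minimal end-to-end sketch (recapping the filesystem server used in the examples earlier; the model name is an assumption, use any model your provider supports):

```ruby
require 'ruby_llm/mcp'

# 1. Provider credentials
RubyLLM.configure { |config| config.openai_api_key = ENV.fetch('OPENAI_API_KEY') }

# 2. Connect to an MCP server over stdio
client = RubyLLM::MCP.client(
  name: 'filesystem',
  transport_type: :stdio,
  config: { command: 'npx', args: ['@modelcontextprotocol/server-filesystem', Dir.pwd] }
)

# 3. Attach the server's tools to a chat
chat = RubyLLM.chat(model: 'gpt-4o-mini')
chat.with_tools(*client.tools)
puts chat.ask('What files are in this directory?')
```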
271 | 142 | ## Contributing |
272 | 143 |
|
273 | | -We welcome contributions! Bug reports and pull requests are welcome on GitHub at https://github.com/patvice/ruby_llm-mcp. |
| 144 | +Bug reports and pull requests are welcome on [GitHub](https://github.com/patvice/ruby_llm-mcp). |
274 | 145 |
|
275 | 146 | ## License |
276 | 147 |
|
|