Commit 00ceff0

Merge pull request #114 from patvice/spec-2025-11-25-support
Add MCP 2025-11-25 support with tasks + schema updates
2 parents 100c54b + b58f140 commit 00ceff0

55 files changed

Lines changed: 6408 additions & 430 deletions


README.md

Lines changed: 85 additions & 214 deletions
@@ -1,276 +1,147 @@
-<img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM" height="120" width="250">
+<div align="center">
+  <img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM::MCP" height="120" width="250">
 
-**Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.**
+  <strong>MCP made simple for RubyLLM.</strong>
 
-This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.
-
-For a more detailed guide, see the [RubyLLM::MCP docs](https://rubyllm-mcp.com/).
-
-Currently full support for MCP protocol version up to `2025-06-18`.
-
-<div class="badge-container">
-  <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a>
-  <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a>
+  <p>
+    <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a>
+    <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a>
+  </p>
 </div>
 
-## RubyLLM::MCP Features
-
-- 🎛️ **Dual SDK Support** _(v0.8+)_: Choose between native full-featured implementation or official MCP SDK
-- 🔌 **Multiple Transport Types**: Streamable HTTP, STDIO, and SSE transports
-- 🛠️ **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools
-- 📄 **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations
-- 🎯 **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions
-- 🎨 **Client Features**: Support for sampling, roots, progress tracking, human-in-the-loop, and elicitation
-- 🔧 **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration
-- 🔄 **Multiple Client Management**: Create and manage multiple MCP clients simultaneously for different servers and purposes
-- 📚 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM
-
-## Installation
-
-```bash
-bundle add ruby_llm-mcp
-```
+`ruby_llm-mcp` connects Ruby applications to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers and integrates them directly with [RubyLLM](https://github.com/crmne/ruby_llm).
 
-or add this line to your application's Gemfile:
+## Simple Configuration
 
 ```ruby
-gem 'ruby_llm-mcp'
-```
-
-And then execute:
-
-```bash
-bundle install
-```
-
-Or install it yourself as:
-
-```bash
-gem install ruby_llm-mcp
-```
-
-## Choosing an Adapter
-
-Starting with version 0.8.0, RubyLLM MCP supports multiple SDK adapters:
+require 'ruby_llm/mcp'
 
-### RubyLLM Adapter (Default)
+RubyLLM.configure do |config|
+  config.openai_api_key = ENV.fetch('OPENAI_API_KEY')
+end
 
-The native implementation with full MCP protocol support:
+RubyLLM::MCP.configure do |config|
+  config.request_timeout = 8_000
+end
 
-```ruby
 client = RubyLLM::MCP.client(
-  name: "server",
-  adapter: :ruby_llm, # Default, can be omitted
+  name: 'filesystem',
   transport_type: :stdio,
-  config: { command: "mcp-server" }
+  config: {
+    command: 'npx',
+    args: ['@modelcontextprotocol/server-filesystem', Dir.pwd]
+  }
 )
 ```
 
-**Features**: All MCP features including SSE transport, sampling, roots, progress tracking, etc.
-
-### MCP SDK Adapter
-
-The official Anthropic-maintained SDK:
+## Core Use Cases
 
 ```ruby
-# Add to Gemfile
-gem 'mcp', '~> 0.7'
+# Use MCP tools in a chat
+chat = RubyLLM.chat(model: 'gpt-4o-mini')
+chat.with_tools(*client.tools)
 
-# Use in code
-client = RubyLLM::MCP.client(
-  name: "server",
-  adapter: :mcp_sdk,
-  transport_type: :stdio,
-  config: { command: "mcp-server" }
-)
+puts chat.ask('List the Ruby files in this project and summarize what you find.')
 ```
 
-**Features**: Core MCP features (tools, resources, prompts, resource templates, logging). No sampling, roots, or other advanced client features.
-
-See the [Adapters Guide](https://rubyllm-mcp.com/guides/adapters.html) for detailed comparison.
-
-## Usage
-
-### Basic Setup
-
-First, configure your RubyLLM client and create an MCP connection:
-
 ```ruby
-require 'ruby_llm/mcp'
-
-# Configure RubyLLM
-RubyLLM.configure do |config|
-  config.openai_api_key = "your-api-key"
-end
-
-# Connect to an MCP server via SSE
-client = RubyLLM::MCP.client(
-  name: "my-mcp-server",
-  transport_type: :sse,
-  config: {
-    url: "http://localhost:9292/mcp/sse"
-  }
-)
+# Add a server resource to chat context
+resource = client.resource('project_readme')
 
-# Or connect via stdio
-client = RubyLLM::MCP.client(
-  name: "my-mcp-server",
-  transport_type: :stdio,
-  config: {
-    command: "node",
-    args: ["path/to/mcp-server.js"],
-    env: { "NODE_ENV" => "production" }
-  }
-)
+chat = RubyLLM.chat(model: 'gpt-4o-mini')
+chat.with_resource(resource)
 
-# Or connect via streamable HTTP
-client = RubyLLM::MCP.client(
-  name: "my-mcp-server",
-  transport_type: :streamable,
-  config: {
-    url: "http://localhost:8080/mcp",
-    headers: { "Authorization" => "Bearer your-token" }
-  }
-)
+puts chat.ask('Summarize this project in 5 bullets.')
 ```
 
-### Using MCP Tools with RubyLLM
-
 ```ruby
-# Get available tools from the MCP server
-tools = client.tools
-puts "Available tools:"
-tools.each do |tool|
-  puts "- #{tool.name}: #{tool.description}"
-end
+# Execute a predefined MCP prompt with arguments
+prompt = client.prompt('code_review')
+chat = RubyLLM.chat(model: 'gpt-4o-mini')
 
-# Create a chat session with MCP tools
-chat = RubyLLM.chat(model: "gpt-4")
-chat.with_tools(*client.tools)
+response = chat.ask_prompt(prompt, arguments: {
+  language: 'ruby',
+  focus: 'security'
+})
 
-# Ask a question that will use the MCP tools
-response = chat.ask("Can you help me search for recent files in my project?")
 puts response
 ```
 
-### Manual Tool Execution
-
-You can also execute MCP tools directly:
-
 ```ruby
-# Tools Execution
-tool = client.tool("search_files")
-
-# Execute a specific tool
-result = tool.execute(
-  name: "search_files",
-  parameters: {
-    query: "*.rb",
-    directory: "/path/to/search"
+# Authenticate to a protected MCP server with browser OAuth
+client = RubyLLM::MCP.client(
+  name: 'oauth-server',
+  transport_type: :streamable,
+  start: false,
+  config: {
+    url: 'https://mcp.example.com/mcp',
+    oauth: { scope: 'mcp:read mcp:write' }
   }
 )
 
-puts result
+client.oauth(type: :browser).authenticate
+client.start
 ```
 
-### Working with Resources
-
-MCP servers can provide access to resources - structured data that can be included in conversations. Resources come in two types: normal resources and resource templates.
-
-#### Normal Resources
-
 ```ruby
-# Get available resources from the MCP server
-resources = client.resources
-puts "Available resources:"
-resources.each do |resource|
-  puts "- #{resource.name}: #{resource.description}"
+# Poll a long-running MCP task and fetch its final result
+task = client.task_get('task-123')
+
+until task.completed? || task.failed? || task.cancelled?
+  sleep((task.poll_interval || 250) / 1000.0)
+  task = task.refresh
 end
 
-# Access a specific resource by name
-file_resource = client.resource("project_readme")
-content = file_resource.content
-puts "Resource content: #{content}"
+if task.completed?
+  payload = client.task_result(task.task_id)
+  puts payload.dig('content', 0, 'text')
+else
+  puts "Task ended with status: #{task.status}"
+end
+```
 
-# Include a resource in a chat conversation for reference with an LLM
-chat = RubyLLM.chat(model: "gpt-4")
-chat.with_resource(file_resource)
+## Support At A Glance
 
-# Or add a resource directly to the conversation
-file_resource.include(chat)
+- **Native MCP client implementation (`:ruby_llm`)** with full protocol support through `2025-11-25`
+- **Official MCP SDK adapter support (`:mcp_sdk`)** via the `mcp` gem for teams that prefer SDK-backed integration
+- **OAuth implementation** for authenticated streamable HTTP MCP servers
+- **Transports:** `stdio`, `sse`, `streamable` / `streamable_http`
+- **Core server features:** tools, resources, resource templates, prompts, notifications
+- **Advanced client features:** sampling, roots, progress tracking, human-in-the-loop, elicitation
+- **Task lifecycle APIs** (`tasks/list`, `tasks/get`, `tasks/result`, `tasks/cancel`) are experimental
 
-response = chat.ask("Can you summarize this README file?")
-puts response
-```
+> [!WARNING]
+> MCP task support is experimental and subject to change in both the MCP spec and this gem's implementation.
 
-#### Resource Templates
+## Install
 
-Resource templates are parameterized resources that can be dynamically configured:
+Add to your Gemfile:
 
 ```ruby
-# Get available resource templates
-templates = client.resource_templates
-log_template = client.resource_template("application_logs")
-
-# Use a template with parameters
-chat = RubyLLM.chat(model: "gpt-4")
-chat.with_resource_template(log_template, arguments: {
-  date: "2024-01-15",
-  level: "error"
-})
-
-response = chat.ask("What errors occurred on this date?")
-puts response
-
-# You can also get templated content directly
-content = log_template.to_content(arguments: {
-  date: "2024-01-15",
-  level: "error"
-})
-puts content
+gem 'ruby_llm-mcp'
 ```
 
-### Working with Prompts
-
-MCP servers can provide predefined prompts that can be used in conversations:
+Optional (for `:mcp_sdk` adapter):
 
 ```ruby
-# Get available prompts from the MCP server
-prompts = client.prompts
-puts "Available prompts:"
-prompts.each do |prompt|
-  puts "- #{prompt.name}: #{prompt.description}"
-  prompt.arguments.each do |arg|
-    puts "  - #{arg.name}: #{arg.description} (required: #{arg.required})"
-  end
-end
-
-# Use a prompt in a conversation
-greeting_prompt = client.prompt("daily_greeting")
-chat = RubyLLM.chat(model: "gpt-4")
-
-# Method 1: Ask prompt directly
-response = chat.ask_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
-puts response
-
-# Method 2: Add prompt to chat and then ask
-chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
-response = chat.ask("Continue with the greeting")
+gem 'mcp', '~> 0.7'
 ```
 
-## Development
-
-After checking out the repo, run `bundle` to install dependencies. Then, run `bundle exec rake` to run the tests. Tests currently use `bun` to run test MCP servers. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
-
-There are also examples you can run to verify the gem is working as expected.
+Then run:
 
 ```bash
-bundle exec ruby examples/tools/local_mcp.rb
+bundle install
 ```
 
+## Setup
+
+1. Set your RubyLLM provider credentials (for example `OPENAI_API_KEY`).
+2. Start or access an MCP server.
+3. Create a `RubyLLM::MCP.client` and attach its tools/resources/prompts to chat flows.
+
 ## Contributing
 
-We welcome contributions! Bug reports and pull requests are welcome on GitHub at https://github.com/patvice/ruby_llm-mcp.
+Bug reports and pull requests are welcome on [GitHub](https://github.com/patvice/ruby_llm-mcp).
 
 ## License
 
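The task-polling loop added in this README sleeps for `(task.poll_interval || 250) / 1000.0` seconds between polls. As a minimal, gem-free sketch of that interval handling (the helper name `poll_sleep_seconds` is illustrative, not part of the gem's API):

```ruby
# Convert a task's server-suggested poll interval (milliseconds, possibly nil)
# into seconds for Kernel#sleep, falling back to 250 ms as the README's loop does.
def poll_sleep_seconds(poll_interval_ms)
  (poll_interval_ms || 250) / 1000.0
end

poll_sleep_seconds(nil)    # => 0.25
poll_sleep_seconds(1_000)  # => 1.0
```

Integer division would silently floor the value, so dividing by the float `1000.0` (as the README does) keeps sub-second intervals intact.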

docs/client/elicitation.md

Lines changed: 2 additions & 2 deletions
@@ -31,9 +31,9 @@ When elicitation is enabled, MCP servers can send "elicitation" requests to your
 This is useful for servers that need user input or clarification during complex workflows.
 
 {: .new }
-Elicitation is a new feature in MCP Protocol 2025-06-18.
+Elicitation was introduced in MCP Protocol 2025-06-18.
 
-**Note:** Elicitation is only available for clients that support the `2025-06-18` protocol version.
+**Note:** Elicitation is available for clients using protocol version `2025-06-18` or newer.
 
 ## Basic Elicitation Configuration
 
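The "2025-06-18 or newer" cutoff in this doc works because MCP protocol versions are dated `YYYY-MM-DD` strings, which order chronologically when compared lexicographically. A hedged sketch of such a gate (the constant and method names are illustrative, not the gem's API):

```ruby
# MCP protocol versions are ISO-dated strings, so String#>= gives
# chronological ordering without any date parsing.
ELICITATION_MIN_VERSION = '2025-06-18'

def supports_elicitation?(protocol_version)
  protocol_version >= ELICITATION_MIN_VERSION
end

supports_elicitation?('2025-03-26') # => false
supports_elicitation?('2025-11-25') # => true
```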
