Ruby MCP Client
A Ruby client for the Model Context Protocol (MCP), enabling integration with external tools and services via a standardized protocol.
Installation
# Gemfile
gem 'ruby-mcp-client'
bundle install
# or
gem install ruby-mcp-client
Overview
MCP enables AI assistants to discover and invoke external tools via different transport mechanisms:
- stdio - Local processes implementing the MCP protocol
- SSE - Server-Sent Events with streaming support
- HTTP - Simple request/response (non-streaming)
- Streamable HTTP - HTTP POST with SSE-formatted responses
Built-in API conversions: to_openai_tools(), to_anthropic_tools(), to_google_tools()
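The conversion helpers map each MCP tool definition onto the target API's tool schema. A minimal sketch of the shape to_openai_tools produces (illustrative only, not the gem's implementation; the sample tool is hypothetical):

```ruby
require 'json'

# Sketch: an MCP tool's name/description/inputSchema map onto the
# OpenAI function-calling tool format.
def mcp_tool_to_openai(tool)
  {
    type: 'function',
    function: {
      name: tool[:name],
      description: tool[:description],
      parameters: tool[:schema] # MCP inputSchema is already JSON Schema
    }
  }
end

mcp_tool = {
  name: 'read_file',
  description: 'Read a file from disk',
  schema: { 'type' => 'object', 'properties' => { 'path' => { 'type' => 'string' } } }
}
openai_tool = mcp_tool_to_openai(mcp_tool)
puts JSON.pretty_generate(openai_tool)
```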
MCP Protocol Support
Implements the MCP 2025-11-25 specification:
- Tools: list, call, streaming, annotations (hint-style), structured outputs, title
- Prompts: list, get with parameters
- Resources: list, read, templates, subscriptions, pagination, ResourceLink content
- Elicitation: Server-initiated user interactions (stdio, SSE, Streamable HTTP)
- Roots: Filesystem scope boundaries with change notifications
- Sampling: Server-requested LLM completions with modelPreferences
- Completion: Autocomplete for prompts/resources with context
- Logging: Server log messages with level filtering
- Tasks: Structured task management with progress tracking
- Audio: Audio content type support
- OAuth 2.1: PKCE, server discovery, dynamic registration
Quick Connect API (Recommended)
The simplest way to connect to an MCP server:
require 'mcp_client'
# Auto-detect transport from URL
client = MCPClient.connect('http://localhost:8000/sse') # SSE
client = MCPClient.connect('http://localhost:8931/mcp') # Streamable HTTP
client = MCPClient.connect('npx -y @modelcontextprotocol/server-filesystem /home') # stdio
# With options
client = MCPClient.connect('http://api.example.com/mcp',
headers: { 'Authorization' => 'Bearer TOKEN' },
read_timeout: 60,
retries: 3,
logger: Logger.new($stdout)
)
# Multiple servers
client = MCPClient.connect(['http://server1/mcp', 'http://server2/sse'])
# Force specific transport
client = MCPClient.connect('http://custom.com/api', transport: :streamable_http)
# Use the client
tools = client.list_tools
result = client.call_tool('example_tool', { param: 'value' })
client.cleanup
Transport Detection:
| URL Pattern | Transport |
|---|---|
| Ends with /sse | SSE |
| Ends with /mcp | Streamable HTTP |
| stdio://command or Array | stdio |
| npx, node, python, etc. | stdio |
| Other HTTP URLs | Auto-detect (Streamable HTTP → SSE → HTTP) |
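The detection rules in the table above can be sketched in plain Ruby (an illustration of the behavior, not the gem's actual implementation):

```ruby
# Sketch of MCPClient.connect's transport auto-detection.
def detect_transport(target)
  return :stdio if target.is_a?(Array)               # command as argv array
  return :stdio if target.start_with?('stdio://')    # explicit stdio scheme
  return :stdio if target.match?(/\A(npx|node|python|ruby)\b/) # bare command line
  case target
  when %r{/sse\z}  then :sse
  when %r{/mcp\z}  then :streamable_http
  else :auto # probe Streamable HTTP, then SSE, then plain HTTP
  end
end
```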
Working with Tools, Prompts & Resources
# Tools
tools = client.list_tools
result = client.call_tool('tool_name', { param: 'value' })
result = client.call_tool('tool_name', { param: 'value' }, server: 'server_name')
# Batch tool calls
results = client.call_tools([
{ name: 'tool1', parameters: { key: 'value' } },
{ name: 'tool2', parameters: { key: 'value' }, server: 'specific_server' }
])
# Streaming (SSE/Streamable HTTP)
client.call_tool_streaming('tool', { param: 'value' }).each do |chunk|
puts chunk
end
# Prompts
prompts = client.list_prompts
result = client.get_prompt('greeting', { name: 'Alice' })
# Resources
result = client.list_resources
contents = client.read_resource('file:///example.txt')
contents.each do |content|
puts content.text if content.text?
data = Base64.decode64(content.blob) if content.binary?
end
MCP 2025-11-25 Features
Tool Annotations
tool = client.find_tool('delete_user')
# Hint-style annotations (MCP 2025-11-25)
tool.read_only_hint? # Default false; true means the tool does not modify its environment
tool.destructive_hint? # Default true; the tool may perform destructive updates
tool.idempotent_hint? # Default false; true means repeated calls with the same arguments have no additional effect
tool.open_world_hint? # Default true; the tool may interact with external entities
# Legacy annotations
tool.read_only? # Safe to execute?
tool.destructive? # Warning: destructive operation
tool.requires_confirmation? # Needs user confirmation
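Because each hint has a spec-defined default when absent, client policy code has to apply those defaults before reasoning about a tool. A small policy sketch (hypothetical helpers, not part of the gem):

```ruby
# Spec defaults for absent annotation hints (MCP ToolAnnotations).
SPEC_DEFAULTS = {
  read_only_hint: false,
  destructive_hint: true,
  idempotent_hint: false,
  open_world_hint: true
}.freeze

def hint(tool, key)
  tool.fetch(key, SPEC_DEFAULTS.fetch(key))
end

# Example policy: auto-run only tools that are read-only or non-destructive.
def safe_to_autorun?(tool)
  hint(tool, :read_only_hint) || !hint(tool, :destructive_hint)
end
```

Note that a tool with no annotations is treated as potentially destructive, which is the conservative behavior the spec defaults imply.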
Structured Outputs
tool = client.find_tool('get_weather')
tool.structured_output? # Has output schema?
tool.output_schema # JSON Schema for output
result = client.call_tool('get_weather', { location: 'SF' })
data = result['structuredContent'] # Type-safe structured data
Roots
# Set filesystem scope boundaries
client.roots = [
{ uri: 'file:///home/user/project', name: 'Project' },
{ uri: 'file:///var/log', name: 'Logs' }
]
# Access current roots
client.roots
Sampling (Server-requested LLM completions)
# Configure handler when creating client
client = MCPClient.connect('http://server/mcp',
sampling_handler: ->(messages, model_prefs, system_prompt, max_tokens) {
# Process server's LLM request
{
'model' => 'gpt-4',
'stopReason' => 'endTurn',
'role' => 'assistant',
'content' => { 'type' => 'text', 'text' => 'Response here' }
}
}
)
Completion (Autocomplete)
result = client.complete(
ref: { type: 'ref/prompt', name: 'greeting' },
argument: { name: 'name', value: 'A' }
)
# => { 'values' => ['Alice', 'Alex'], 'total' => 100, 'hasMore' => true }
Logging
# Set log level
client.log_level = 'debug' # debug/info/notice/warning/error/critical
# Handle log notifications
client.on_notification do |server, method, params|
if method == 'notifications/message'
puts "[#{params['level']}] #{params['logger']}: #{params['data']}"
end
end
Elicitation (Server-initiated user interactions)
client = MCPClient::Client.new(
mcp_server_configs: [MCPClient.stdio_config(command: 'python server.py')],
elicitation_handler: ->(message, schema) {
puts "Server asks: #{message}"
# Return: { 'action' => 'accept', 'content' => { 'field' => 'value' } }
# Or: { 'action' => 'decline' } or { 'action' => 'cancel' }
}
)
Advanced Configuration
For more control, use create_client with explicit configs:
client = MCPClient.create_client(
mcp_server_configs: [
MCPClient.stdio_config(command: 'npx server', name: 'local'),
MCPClient.sse_config(
base_url: 'https://api.example.com/sse',
headers: { 'Authorization' => 'Bearer TOKEN' },
read_timeout: 30, ping: 10, retries: 3
),
MCPClient.http_config(
base_url: 'https://api.example.com',
endpoint: '/rpc',
headers: { 'Authorization' => 'Bearer TOKEN' }
),
MCPClient.streamable_http_config(
base_url: 'https://api.example.com/mcp',
read_timeout: 60, retries: 3
)
],
logger: Logger.new($stdout)
)
# Or load from JSON file
client = MCPClient.create_client(server_definition_file: 'servers.json')
Faraday Customization
MCPClient.http_config(base_url: 'https://internal.company.com') do |faraday|
faraday.ssl.cert_store = custom_cert_store
faraday.ssl.verify = true
end
Server Definition JSON
{
"mcpServers": {
"filesystem": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/home"]
},
"api": {
"type": "streamable_http",
"url": "https://api.example.com/mcp",
"headers": { "Authorization": "Bearer TOKEN" }
}
}
}
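For illustration, here is roughly how a servers.json of the shape above maps to per-server config hashes; in practice MCPClient.create_client(server_definition_file: ...) handles this for you (the loader below is a hypothetical sketch, not the gem's code):

```ruby
require 'json'

# Sketch: flatten the "mcpServers" object into an array of config
# hashes, carrying the server key along as its name.
def parse_server_definitions(json_text)
  JSON.parse(json_text).fetch('mcpServers', {}).map do |name, cfg|
    cfg.merge('name' => name)
  end
end

json = '{"mcpServers":{"fs":{"type":"stdio","command":"npx"}}}'
defs = parse_server_definitions(json)
```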
AI Integration Examples
OpenAI
require 'mcp_client'
require 'openai'
mcp = MCPClient.connect('npx -y @modelcontextprotocol/server-filesystem .')
tools = mcp.to_openai_tools
client = OpenAI::Client.new(api_key: ENV['OPENAI_API_KEY'])
response = client.chat.completions.create(
model: 'gpt-4',
messages: [{ role: 'user', content: 'List files' }],
tools: tools
)
Anthropic
require 'mcp_client'
require 'anthropic'
mcp = MCPClient.connect('npx -y @modelcontextprotocol/server-filesystem .')
tools = mcp.to_anthropic_tools
client = Anthropic::Client.new(access_token: ENV['ANTHROPIC_API_KEY'])
# Use tools with Claude API
RubyLLM
require 'mcp_client'
require 'ruby_llm'
RubyLLM.configure { |c| c.openai_api_key = ENV['OPENAI_API_KEY'] }
mcp = MCPClient.connect('http://localhost:8931/mcp') # Playwright MCP
# Wrap each MCP tool as a RubyLLM tool
tools = mcp.list_tools.map do |t|
tool_name = t.name
Class.new(RubyLLM::Tool) do
description t.description
params t.schema
define_method(:name) { tool_name }
define_method(:execute) { |**args| mcp.call_tool(tool_name, args) }
end.new
end
chat = RubyLLM.chat(model: 'gpt-4o-mini')
tools.each { |tool| chat.with_tool(tool) }
response = chat.ask('Navigate to google.com and tell me the page title')
See examples/ for complete implementations:
- ruby_openai_mcp.rb, openai_ruby_mcp.rb - OpenAI integration
- ruby_anthropic_mcp.rb - Anthropic integration
- gemini_ai_mcp.rb - Google Vertex AI integration
- ruby_llm_mcp.rb - RubyLLM integration (OpenAI provider)
OAuth 2.1 Authentication
require 'mcp_client/auth/browser_oauth'
oauth = MCPClient::Auth::OAuthProvider.new(
server_url: 'https://api.example.com/mcp',
redirect_uri: 'http://localhost:8080/callback',
scope: 'mcp:read mcp:write'
)
browser_oauth = MCPClient::Auth::BrowserOAuth.new(oauth)
token = browser_oauth.authenticate # Opens browser, handles callback
client = MCPClient::Client.new(
mcp_server_configs: [{
type: 'streamable_http',
base_url: 'https://api.example.com/mcp',
oauth_provider: oauth
}]
)
Features: PKCE, server discovery (.well-known), dynamic registration, token refresh.
See OAUTH.md for full documentation.
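The PKCE step works the same way in any OAuth 2.1 client: generate a random code_verifier, send its S256 code_challenge with the authorization request, and present the verifier when exchanging the code. A self-contained sketch of that derivation per RFC 7636 (the helper name is illustrative, not the gem's API):

```ruby
require 'securerandom'
require 'digest'
require 'base64'

# PKCE (RFC 7636): 32 random bytes, base64url-encoded without padding,
# give a 43-character code_verifier; the code_challenge is the
# base64url-encoded SHA-256 of the verifier.
def pkce_pair
  verifier  = Base64.urlsafe_encode64(SecureRandom.random_bytes(32), padding: false)
  challenge = Base64.urlsafe_encode64(Digest::SHA256.digest(verifier), padding: false)
  [verifier, challenge]
end

verifier, challenge = pkce_pair
```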
Server Notifications
client.on_notification do |server, method, params|
case method
when 'notifications/tools/list_changed'
client.clear_cache # Auto-handled
when 'notifications/message'
puts "Log: #{params['data']}"
when 'notifications/roots/list_changed'
puts "Roots changed"
end
end
Session Management
Both HTTP and Streamable HTTP transports automatically handle session-based servers:
- Session capture: Extracts Mcp-Session-Id from the initialize response
- Session persistence: Includes the session header in subsequent requests
- Session termination: Sends DELETE request during cleanup
- Resumability (Streamable HTTP): Tracks event IDs for message replay
No configuration required - works automatically.
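The lifecycle above amounts to capturing one header and echoing it back. A sketch of what the transports do internally (class and method names hypothetical):

```ruby
# Sketch of the session lifecycle: capture Mcp-Session-Id once,
# then attach it to every later request (including the final DELETE).
class SessionTracker
  attr_reader :session_id

  # Called with the initialize response's headers.
  def capture(response_headers)
    @session_id = response_headers['Mcp-Session-Id']
  end

  # Headers to merge into every subsequent request.
  def request_headers
    session_id ? { 'Mcp-Session-Id' => session_id } : {}
  end
end
```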
Server Compatibility
Works with any MCP-compatible server:
- @modelcontextprotocol/server-filesystem
- @playwright/mcp
- FastMCP
- Custom servers implementing MCP protocol
FastMCP Example
# Start server
python examples/echo_server_streamable.py
# Connect and use
client = MCPClient.connect('http://localhost:8931/mcp')
tools = client.list_tools
result = client.call_tool('echo', { message: 'Hello!' })
Requirements
- Ruby >= 3.2.0
- No runtime dependencies
License
Available as open source under the MIT License.
Contributing
Bug reports and pull requests welcome at https://github.com/simonx1/ruby-mcp-client.