MCP Tools for Open WebUI
A streamlined Model Context Protocol (MCP) implementation designed for seamless integration with Open WebUI via the mcpo proxy. This project provides secure Python code execution, time tools, and SDXL image generation, all accessible through a clean chat interface.
🎯 What This Does
Transform your AI chat into a powerful development environment:
- 💬 Natural Language: Ask the AI to run code, check the time, or install packages
- 🛡️ Secure Execution: Sandboxed Python environment with safety controls
- 🔧 Easy Integration: One-command deployment with Docker Compose
- 📊 Full Observability: Prometheus metrics + Grafana dashboards
⚡ Quick Demo
After setup, you can chat with AI like this:
- "What time is it right now?" → Gets current UTC time
- "Calculate fibonacci(10) in Python" → Executes code safely
- "Install requests and fetch httpbin.org/json" → Installs packages and runs code
🏗️ Architecture Overview
The system uses mcpo to bridge MCP tools with Open WebUI's OpenAPI interface:
Core Services
- 🌐 Open WebUI (Port 3001): Modern chat interface with tool integration
- ⚡ mcpo (Port 8080): MCP-to-OpenAPI proxy bridge
- 🛡️ Sandbox (Port 8001): Secure Python code execution environment
- ⏰ Time Client (Port 8003): Time-related MCP tools
- 🎨 Diffusion API (Port 8000): SDXL image generation with style support
- 🖼️ Style Browser (Port 8081): Interactive style discovery interface
- 🤖 Ollama (Port 11434): Local LLM inference with GPU acceleration
Infrastructure Services
- 📈 Prometheus (Port 9090): Metrics collection
- 📊 Grafana (Port 3000): Visualization dashboards
graph TB
    User[👤 User]
    OpenWebUI[🌐 Open WebUI:3001<br/>Chat Interface]
    mcpo[⚡ mcpo:8080<br/>MCP-to-OpenAPI Bridge]
    subgraph "MCP Tools"
        Sandbox[🛡️ Sandbox:8001<br/>Code Execution]
        TimeClient[⏰ Time Client:8003<br/>Time Tools]
        DiffusionAPI[🎨 Diffusion API:8000<br/>SDXL Image Generation]
    end
    StyleBrowser[🖼️ Style Browser:8081<br/>Style Discovery]
    Ollama[🤖 Ollama:11434<br/>LLM Inference]
    User --> OpenWebUI
    OpenWebUI --> mcpo
    OpenWebUI --> Ollama
    mcpo --> Sandbox
    mcpo --> TimeClient
    mcpo --> DiffusionAPI
    StyleBrowser --> DiffusionAPI
    style mcpo fill:#ff6b6b,stroke:#333,stroke-width:3px
    style OpenWebUI fill:#4ecdc4,stroke:#333,stroke-width:3px
    style DiffusionAPI fill:#9b59b6,stroke:#333,stroke-width:3px
📚 Component Documentation
Each component has detailed documentation with setup instructions, API specifications, and troubleshooting guides:
- 🛡️ Sandbox Service - Secure Python code execution environment
- ⏰ Time Client - Time and timezone management service
- 🎨 Diffusion API - SDXL image generation with 200+ artistic styles
- 🔍 Style Browser - Interactive web interface for style discovery
- 📊 Monitoring Stack - Prometheus & Grafana observability solution
🚀 Quick Start
Prerequisites
- Docker and Docker Compose
- Git
- NVIDIA GPU (optional, for Ollama acceleration)
Installation
# Clone the repository
git clone <repository-url>
cd mcp_things2
# Create data directories for persistent storage
mkdir -p data/grafana data/prometheus data/open-webui
# Start all services
docker-compose up --build
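Once the build finishes, a quick status check (plain Docker Compose commands, nothing project-specific) confirms that all containers came up:
# All services should report "Up" (or "healthy" where healthchecks are defined)
docker-compose ps
# Follow the combined logs while the stack warms up
docker-compose logs -f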
🌐 Access Your Tools
Once running, visit these URLs:
| Service | URL | Purpose |
|---|---|---|
| Open WebUI | http://localhost:3001 | 💬 Main chat interface |
| Style Browser | http://localhost:8081 | 🎨 Browse SDXL styles |
| mcpo Proxy | http://localhost:8080 | ⚡ Tool API gateway |
| Time Tools Docs | http://localhost:8080/time/docs | ⏰ Interactive API docs |
| Sandbox Tools Docs | http://localhost:8080/sandbox/docs | 🛡️ Code execution docs |
| Diffusion API Docs | http://localhost:8000/docs | 🎨 Image generation API |
| Ollama | http://localhost:11434 | 🤖 LLM inference |
| Prometheus | http://localhost:9090 | 📈 Metrics collection |
| Grafana | http://localhost:3000 | 📊 Dashboards (admin/admin) |
🛠️ Available Tools
⏰ Time Tools
- get_current_time: Returns the current UTC time as a formatted string
- Access: Available in chat or at http://localhost:8080/time/docs
🛡️ Code Execution Tools
- execute_python: Execute Python code in the secure sandbox environment
- pip_install: Install Python packages with security validation
- Access: Available in chat or at http://localhost:8080/sandbox/docs
🎨 Image Generation Tools
- generate_images: Create AI images with SDXL using optional style templates
- list_styles: Browse hundreds of available artistic styles
- suggest_styles: Get AI-suggested styles based on your description
- get_style: Get details about a specific style
- Style Browser: Interactive web interface at http://localhost:8081
- Access: Available in chat or at http://localhost:8000/docs (see the example request below)
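As a rough illustration, image generation can also be driven directly over HTTP. The route and payload below are assumptions that mirror the MCP tool names; check http://localhost:8000/docs for the authoritative schema:
# Hypothetical request; the path and field names mirror the tool parameters and may differ
curl -X POST "http://localhost:8000/generate_images" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "a sunset over mountains", "style": "Fooocus Photograph"}'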
🔒 Security Features
- Container isolation for all code execution
- Package name validation and blocklist
- Suspicious package detection and blocking
- Non-root execution environment
- Timeout protection and resource limits
💬 Using the Chat Interface
Simply open http://localhost:3001 and start chatting! Examples:
Time Queries:
- "What time is it?"
- "Show me the current UTC time"
Code Execution:
- "Calculate the factorial of 8 in Python"
- "Create a list of prime numbers up to 50"
- "Show me a simple matplotlib plot"
Package Management:
- "Install the requests library"
- "Install numpy and create a random array"
- "Use pandas to create a DataFrame"
Image Generation:
- "Generate an image of a sunset over mountains"
- "Create a cinematic style image of a cyberpunk city"
- "What artistic styles are available for image generation?"
- "Suggest some styles for creating fantasy artwork"
- "Generate a portrait in the Fooocus Photograph style"
🔧 Direct API Usage
You can also call tools directly via the mcpo API:
# Get current time
curl -X GET "http://localhost:8080/time/get_current_time" \
-H "Authorization: Bearer your-secret-key"
# Execute Python code
curl -X POST "http://localhost:8080/sandbox/execute" \
-H "Authorization: Bearer your-secret-key" \
-H "Content-Type: application/json" \
-d '{"code": "print(\"Hello, MCP World!\")"}'
# Install a package
curl -X POST "http://localhost:8080/sandbox/pip_install" \
-H "Authorization: Bearer your-secret-key" \
-H "Content-Type: application/json" \
-d '{"package": "requests", "version": "latest"}'
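Since mcpo exposes each tool as a standard OpenAPI service, you can also discover the generated endpoints programmatically. This assumes the default FastAPI schema path, which the per-tool docs pages suggest is available:
# Fetch the generated OpenAPI schema for the sandbox tools
curl -H "Authorization: Bearer your-secret-key" \
  "http://localhost:8080/sandbox/openapi.json"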
🏗️ Project Structure
mcp_things2/
├── docker-compose.yml       # Service orchestration
├── mcpo-config.json         # mcpo tool configuration
├── sandbox/                 # Secure code execution
│   ├── app/main.py          # Sandbox FastAPI-MCP service
│   ├── requirements.txt     # Python dependencies
│   └── Dockerfile           # Container definition
├── time-client/             # Time tools
│   ├── app/main.py          # Time FastAPI-MCP service
│   ├── requirements.txt     # Python dependencies
│   └── Dockerfile           # Container definition
├── data/                    # Persistent data (host mounts)
│   ├── grafana/             # Grafana dashboards and data
│   ├── prometheus/          # Prometheus metrics data
│   └── open-webui/          # Open WebUI user data
├── monitoring/              # Monitoring configuration
│   ├── prometheus.yml       # Prometheus scrape config
│   └── grafana/             # Grafana provisioning
└── workspace/               # Shared workspace volume
⚙️ Configuration
mcpo Configuration
The mcpo-config.json file defines which MCP services to expose (an optional extension is sketched after the example):
{
"mcpServers": {
"time": {
"type": "sse",
"url": "http://time-client:8003/mcp/sse"
},
"sandbox": {
"type": "sse",
"url": "http://sandbox:8001/mcp/sse"
}
}
}
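The architecture diagram also routes the Diffusion API through mcpo, but it is not listed in the sample above. If you want it proxied as well, an entry along these lines could be added; the service hostname and SSE path are assumptions modeled on the other entries and may differ for the diffusion container:
"diffusion": {
  "type": "sse",
  "url": "http://diffusion-api:8000/mcp/sse"
}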
Environment Variables
Key configuration options (a compose-file sketch follows below):
Open WebUI:
OPENAI_API_BASE_URL=http://ollama:11434/v1
OPENAI_API_KEY=ollama
ENABLE_OPENAI_API=true
mcpo:
- API key configured in docker-compose.yml
- Port and config file path
Ollama:
- Automatic GPU detection if available
- Models stored in ~/docker-data/ollama
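For reference, these options typically live in docker-compose.yml as environment entries. The excerpt below is an illustrative sketch rather than the exact file shipped with the repo (the Open WebUI container port is an assumption):
# Illustrative excerpt from docker-compose.yml
open-webui:
  environment:
    - OPENAI_API_BASE_URL=http://ollama:11434/v1
    - OPENAI_API_KEY=ollama
    - ENABLE_OPENAI_API=true
  ports:
    - "3001:8080"   # host port 3001 per the access table; container port assumed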
Ollama Model Management
# Pull models via Ollama service
docker-compose exec ollama ollama pull llama3.1:latest
docker-compose exec ollama ollama list
docker-compose exec ollama ollama show llama3.1:latest
📊 Monitoring & Health
Health Checks
# Check all services
curl http://localhost:3001/health # Open WebUI
curl http://localhost:8080/health # mcpo
curl http://localhost:8001/health # Sandbox
curl http://localhost:8003/health # Time Client
curl http://localhost:11434/api/tags # Ollama
Metrics & Dashboards
- Prometheus: Collects metrics from all services (scrape targets are sketched below)
- Grafana: Provides pre-configured dashboards
- Service Health: Automatic dependency monitoring
- Tool Usage: Track tool calls and execution times
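Scrape targets live in monitoring/prometheus.yml. The snippet below is a sketch only; the job names assume each service exposes a standard Prometheus /metrics endpoint, so adjust it to match the real file:
# Illustrative scrape configuration (not the shipped monitoring/prometheus.yml)
scrape_configs:
  - job_name: "sandbox"
    static_configs:
      - targets: ["sandbox:8001"]
  - job_name: "time-client"
    static_configs:
      - targets: ["time-client:8003"]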
🛡️ Security Considerations
Code Execution Security
- Container Isolation: Each execution runs in isolated environment
- Package Validation: Blocks known malicious packages
- Resource Limits: CPU, memory, and time constraints
- Non-root Execution: All code runs as non-privileged user
- Network Controls: Limited external network access
API Security
- Authentication: API key required for mcpo access
- Input Validation: All inputs sanitized and validated
- Rate Limiting: Protection against abuse
- CORS Configuration: Controlled cross-origin access
🚢 Development & Deployment
Development Workflow
# Start in development mode
docker-compose up --build
# View logs
docker-compose logs -f sandbox
docker-compose logs -f time-client
# Rebuild specific service
docker-compose build sandbox
docker-compose up -d sandbox
# Access container for debugging
docker-compose exec sandbox bash
Adding New Tools
- Create a new FastAPI-MCP service (see the sketch below)
- Add the service to docker-compose.yml
- Update mcpo-config.json with the new endpoint
- Restart mcpo to load the new tools
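A minimal sketch of such a service, assuming it follows the same fastapi-mcp pattern as sandbox/app/main.py and time-client/app/main.py (the library import and mount call are assumptions; mirror whatever the existing services actually do):
# new-tool/app/main.py (illustrative only)
from fastapi import FastAPI
from pydantic import BaseModel
from fastapi_mcp import FastApiMCP  # assumed to match the existing services

app = FastAPI(title="New Tool Service")

class EchoRequest(BaseModel):
    message: str

@app.get("/health")
def health():
    # Liveness endpoint, consistent with the other services' health checks
    return {"status": "ok"}

@app.post("/echo", operation_id="echo")
def echo(req: EchoRequest):
    # Example tool endpoint exposed to the LLM through MCP
    return {"echo": req.message}

# Expose the FastAPI routes as MCP tools (exact call may vary by fastapi-mcp version)
mcp = FastApiMCP(app)
mcp.mount()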
Production Deployment
- Use environment-specific compose files (see the example below)
- Configure secrets management
- Set up proper logging and monitoring
- Consider Kubernetes for larger deployments
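One common pattern for environment-specific files is a Compose override layered on top of the base file; the production file name here is hypothetical:
# Layer a production override (hypothetical docker-compose.prod.yml) on top of the base file
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d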
🔮 Roadmap
See IDEAS.md for planned features:
- Additional tool integrations (Jira, Mattermost, etc.)
- Enhanced security controls
- Multi-user support
- Custom model integrations
- Advanced monitoring and alerting
🤝 Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test with the Docker Compose setup
- Submit a pull request
📖 Documentation
For more details, see:
- Sequence Diagrams - Detailed interaction flows
- Architecture Overview - Deep dive into design
- Docker Patterns - Container best practices
- Development Ideas - Future enhancements
📄 License
MIT License - see LICENSE file for details.
Built with ❤️ using Model Context Protocol, Open WebUI, and mcpo