A comprehensive collection of Model Context Protocol (MCP) servers for AI agents, supporting both Python and Node.js environments and providing advanced capabilities for documentation, authentication, data analysis, image generation, and more.
# Clone the repository
git clone https://github.com/aegntic/aegntic-mcp.git
cd aegntic-mcp
# Run the automated setup script
./setup.sh
# Configure your Claude Desktop or other MCP client
We maintain a unified MCP configuration approach that works seamlessly across both Claude Desktop and Claude Code. This configuration lives in:
~/.config/Claude/claude_desktop_config.json
~/.config/claude-code/mcp_servers.json
~/.mcp-servers/
(for locally developed servers)

Server | Type | Description | Installation Path |
---|---|---|---|
Aegntic Knowledge Engine | Local/UV | Zero-cost unified knowledge engine with web crawling, RAG, memory graph, task management, and documentation context (20 tools) | servers/aegntic-knowledge-engine |
AI Collaboration Hub | Local/UV | AI-powered collaboration tools with OpenRouter integration | ~/.mcp-servers/ai-collaboration-hub |
Claude Export MCP | NPM | Export Claude Desktop projects, conversations, and artifacts to Markdown format | npx @aegntic/claude-export-mcp |
Firebase Studio MCP | NPM | Complete access to Firebase and Google Cloud services | npx @aegntic/firebase-studio-mcp |
n8n MCP | NPM | Unrestricted n8n workflow automation | npx @leonardsellem/n8n-mcp-server |
Docker MCP | UVX | Comprehensive Docker container and image management with Docker Hub integration | uvx mcp-server-docker |
Just Prompt | Local/UV | Advanced prompt orchestration and model routing | /home/tabs/ae-co-system/CLAEM/just-prompt-orchestration/just-prompt |
Quick Data | Local/UV | Fast data processing and analysis tools | /home/tabs/ae-co-system/DAILYDOCO/quick-data-mcp |
DailyDoco Pro | Local/Node | Professional documentation and project management | ~/.mcp-servers/dailydoco-pro |
Aegnt-27 | Local/Node | Advanced AI agent capabilities | ~/.mcp-servers/aegnt-27 |
Aegnt-27-lib | Local/Node | AI agent library and utilities | ~/.mcp-servers/aegnt-27-lib |
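For the NPM-based servers above, the npx command can be dropped straight into an MCP client configuration. A minimal sketch for the Claude Export MCP server (the "claude-export" key is illustrative; see the full configuration below for the complete schema):

{
  "mcpServers": {
    "claude-export": {
      "command": "npx",
      "args": ["-y", "@aegntic/claude-export-mcp"]
    }
  }
}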
These servers are included in our unified configuration:
Server | Type | Description |
---|---|---|
filesystem | NPM | File system operations |
memory | NPM | Memory and knowledge management |
context7 | NPM | Context management for AI conversations |
puppeteer | NPM | Browser automation with Puppeteer |
sequentialthinking | NPM | Sequential thinking and reasoning tools |
github | Smithery | GitHub integration and operations |
exa | Smithery | Advanced search capabilities |
smithery | Smithery | Smithery toolbox utilities |
desktop-commander | Smithery | Desktop automation and control |
ppick | UVX | Process picking and management |
notionApi | NPM | Notion API integration |
supabase | NPM | Supabase database integration |
First, ensure you have the global MCP servers directory:
mkdir -p ~/.mcp-servers
# Install UV package manager
curl -LsSf https://astral.sh/uv/install.sh | sh
# Ensure Node.js 14+ is installed
node --version
npm --version
Create or update ~/.config/Claude/claude_desktop_config.json:
{
"mcpServers": {
"dailydoco-pro": {
"command": "node",
"args": ["/path/to/aegntic-MCP/dailydoco-pro/dist/index.js"],
"env": {
"USER_EMAIL": "your-email@example.com"
}
},
"aegnt-27": {
"command": "node",
"args": ["/path/to/aegntic-MCP/aegnt-27/dist/index.js"],
"env": {
"USER_EMAIL": "your-email@example.com"
}
},
"comfyui": {
"command": "node",
"args": ["/path/to/aegntic-MCP/comfyui-mcp/dist/index.js"],
"env": {
"COMFYUI_HOST": "http://localhost:8188",
"USER_EMAIL": "your-email@example.com"
}
},
"aegntic-auth": {
"command": "node",
"args": ["/path/to/aegntic-MCP/aegntic-auth/dist/index.js"],
"env": {
"SUPABASE_URL": "your-supabase-url",
"SUPABASE_ANON_KEY": "your-supabase-key"
}
},
"graphiti": {
"command": "uv",
"args": [
"run", "--directory", "/path/to/aegntic-MCP/graphiti-mcp",
"python", "graphiti_mcp_server.py", "--transport", "stdio"
],
"env": {
"NEO4J_URI": "bolt://localhost:7687",
"NEO4J_USER": "neo4j",
"NEO4J_PASSWORD": "your-password",
"OPENAI_API_KEY": "your-openai-key"
}
},
"n8n-pro": {
"command": "node",
"args": ["/path/to/aegntic-MCP/n8n-pro/dist/mcp/index.js"],
"env": {
"N8N_API_URL": "http://localhost:5678",
"N8N_API_KEY": "your-n8n-api-key"
}
},
"just-prompt": {
"command": "uv",
"args": [
"run", "--directory", "/path/to/aegntic-MCP/just-prompt",
"just-prompt"
],
"env": {
"OPENROUTER_API_KEY": "your-openrouter-key"
}
},
"quick-data": {
"command": "uv",
"args": [
"run", "--directory", "/path/to/aegntic-MCP/quick-data",
"python", "main.py"
]
},
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/tabs"],
"env": {}
},
"docker": {
"command": "uvx",
"args": ["mcp-server-docker"],
"env": {}
},
"ai-collaboration-hub": {
"command": "uv",
"args": ["run", "python", "-m", "ai_collaboration_hub.server"],
"cwd": "/home/tabs/ae-co-system/aegntic-MCP/servers/ai-collaboration-hub",
"env": {
"OPENROUTER_API_KEY": "your-key-here"
}
}
}
}
Create or update ~/.config/claude-code/mcp_servers.json with the same structure.
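For example, a minimal mcp_servers.json carrying over just the quick-data entry might look like the following (assuming the same schema as above; paths are placeholders):

{
  "mcpServers": {
    "quick-data": {
      "command": "uv",
      "args": [
        "run", "--directory", "/path/to/aegntic-MCP/quick-data",
        "python", "main.py"
      ]
    }
  }
}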
For Node.js-based servers such as DailyDoco Pro and Aegnt-27:
cd ~/.mcp-servers/server-name
npm install
npm run build
For Python-based servers such as AI Collaboration Hub:
cd /path/to/server
uv sync
uv run python -m module_name.server
Each MCP server has its own development environment.

For Node.js servers:

cd server-directory
npm install
npm run build
npm start

For Python servers:

cd server-directory
uv sync
uv run python main.py  # or specific entry point
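The npm commands above assume each Node.js server ships conventional build and start scripts; a minimal package.json sketch of that assumption (not the actual manifest of any server) looks like:

{
  "name": "example-mcp-server",
  "version": "0.1.0",
  "scripts": {
    "build": "tsc",
    "start": "node dist/index.js"
  }
}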
Server | Language | Capabilities |
---|---|---|
dailydoco-pro | TypeScript | Project analysis, video capture, AI test audiences, brand management |
aegnt-27 | TypeScript | Mouse/typing authenticity, AI detection resistance, audio processing |
comfyui-mcp | TypeScript | Image generation, video creation, background removal, logo design |
aegntic-auth | TypeScript | User registration, Stripe payments, email campaigns, usage tracking |
graphiti-mcp | Python | Knowledge graphs, memory storage, entity extraction, temporal data |
n8n-pro | TypeScript | n8n documentation (525+ nodes), workflow validation, node information, AI tool detection |
just-prompt | Python | Multi-LLM prompting, model comparison, CEO decision making |
quick-data | Python | Data analysis, visualization, statistical insights, ML features |
The Model Context Protocol (MCP) is a standard for extending the capabilities of AI assistants like Claude by giving them access to external tools and services. These servers implement the MCP standard to provide specialized functionality that can be used directly from within Claude conversations.
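Under the hood, an MCP client and server exchange JSON-RPC 2.0 messages over stdio or another transport. A simplified sketch of a tool invocation request (the tool name and arguments are illustrative, not taken from a specific server):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "export_project",
    "arguments": { "format": "markdown" }
  }
}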
Each server in this repository can be installed and run independently. See the README in each server's directory for specific installation and usage instructions.
Locally developed servers are installed under ~/.mcp-servers/.
All servers include built-in security features.
Each server includes comprehensive documentation; see the individual server README files for detailed information.
This project is licensed under the MIT License - see individual server LICENSE files for details.
Built with ❤️ for the AI agent ecosystem