# LLM Chat Assistant

A chat assistant that integrates an MCP client with an LLM and other external MCP servers.

This project is a chat assistant application that integrates an MCP client (MCP host) with an LLM (Large Language Model) and external tools (MCP servers). Users interact with the LLM, which either answers directly or calls external tools (MCP servers) to process their requests.
## Supported modes
- stdio Mode
- http Mode
## Features

- LLM Integration: Communicates with an LLM using the pydantic-ai library.
- Tool Execution: Supports external tools (MCP servers) that can be executed based on user input.
- Structured Responses: Handles structured responses from the LLM, including tool calls and direct answers.
- Server Management: Manages multiple MCP servers for tool execution.
## Requirements
- Python 3.13 or higher
- MCP server(s) configured for tool execution
- LLM: locally installed Ollama with qwen3:0.6b by default - you can change the model if needed (see https://ollama.com for installation instructions)
- A `.env` file with `LLM_API_KEY=your-api-key-here` if an external LLM is used
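How the project reads the `.env` file is not shown in this README; a minimal stdlib-only sketch of loading `LLM_API_KEY` from such a file (the real project may use a library like python-dotenv instead - an assumption here):

```python
from pathlib import Path

def load_env(path: str = ".env") -> dict[str, str]:
    """Minimal .env parser: KEY=value lines; blank lines and '#' comments ignored."""
    env: dict[str, str] = {}
    text = Path(path).read_text() if Path(path).exists() else ""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# If LLM_API_KEY is absent, the assistant can fall back to local Ollama,
# which needs no key.
api_key = load_env().get("LLM_API_KEY")
```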
## Installation

The demo MCP servers in the `mcp-servers` folder are built with FastMCP - https://github.com/modelcontextprotocol/python-sdk?tab=readme-ov-file#adding-mcp-to-your-python-project
- More information about FastMCP: https://gofastmcp.com/getting-started/welcome

1. Clone the repository and install the dependencies:
   - `git clone https://github.com/Rommagcom/agentic_ai_mcp.git`
   - `cd agentic_ai_mcp`
   - `pip install -r requirements.txt`
2. Run the host: `python mcp_host_client.py`
3. To test http Mode, first start the demo server from the `mcp-servers` folder: `python mcp-servers/test_http_server.py`
## Interaction Example

- You: echo test (after this request, the LLM determines it must call the necessary tool from an MCP server)
- Assistant: The result of the echo test is a text containing the message "This is echo test test". (direct answer from the MCP server)
- You: How is the weather in Phuket?
- Assistant: The weather in Phuket is currently being retrieved via the API, with the mock response indicating it's a simulated result.
## Add your MCP server

1. Add a new `.py` file to the `mcp-servers` folder, following the examples `echo.py`, `weather_server.py`, or `test_http_server.py`.
2. Add a section to `servers_config.json`, where the key (e.g. `echo`) matches the tool name declared in the `.py` file -> `@mcp.tool(description="A simple echo tool", name="echo")`, and `"args"` holds the server file path, e.g. `"args": ["mcp-servers/weather_server.py"]`.
Example:

```json
{
    "mcpServers": {
        "echo": {
            "command": "python",
            "args": ["mcp-servers/echo.py"]
        },
        "weather_server": {
            "command": "python",
            "args": ["mcp-servers/weather_server.py"]
        }
    }
}
```
For the http use case, add the http(s) MCP server URL to the config as `"url": "http://127.0.0.1:9000/mcp"`, together with a server name such as `"test_http"`:

Example:

```json
{
    "mcpServers": {
        "test_http": {
            "url": "http://127.0.0.1:9000/mcp"
        }
    }
}
```
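How the host consumes these entries is not shown in this README; a plausible stdlib-only sketch that loads a `servers_config.json` and distinguishes stdio entries (`command`/`args`) from http entries (`url`) - the function name is illustrative:

```python
import json

def classify_servers(config_text: str) -> dict[str, str]:
    """Map each configured MCP server name to its transport ('stdio' or 'http')."""
    config = json.loads(config_text)
    transports: dict[str, str] = {}
    for name, entry in config.get("mcpServers", {}).items():
        if "url" in entry:
            transports[name] = "http"    # connect to the MCP endpoint over HTTP
        elif "command" in entry:
            transports[name] = "stdio"   # spawn the process: command + args
        else:
            raise ValueError(f"server {name!r} has neither 'url' nor 'command'")
    return transports

sample = """{
  "mcpServers": {
    "echo": {"command": "python", "args": ["mcp-servers/echo.py"]},
    "test_http": {"url": "http://127.0.0.1:9000/mcp"}
  }
}"""
print(classify_servers(sample))  # {'echo': 'stdio', 'test_http': 'http'}
```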