A unified interface for multiple chat AI models. Send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception through the MCP protocol, either via a single tool or via predefined prompts. A vendor API key is required.
The server implements one tool:

- unichat: Send a request to unichat
The server also offers several predefined prompts:

- code_review
  - code (string, required): The code to review
- document_code
  - code (string, required): The code to comment
- explain_code
  - code (string, required): The code to explain
- code_rework
  - changes (string, optional): The changes to apply
  - code (string, required): The code to rework
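Along the same lines, a hedged sketch of retrieving one of these prompts with `get_prompt`; the prompt and argument names come from the list above, and everything else is an assumption.

```python
# Hypothetical sketch: fetch the "code_review" prompt from the server and
# print the prompt messages it returns.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["unichat-mcp-server"],
    env={"UNICHAT_MODEL": "gpt-4o-mini", "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"},
)

async def review(code: str) -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            prompt = await session.get_prompt("code_review", arguments={"code": code})
            for message in prompt.messages:
                print(message.role, message.content)

asyncio.run(review("def add(a, b): return a - b"))
```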
Add the server to your Claude Desktop configuration file:

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Supported Models:
A list of currently supported models to be used as "SELECTED_UNICHAT_MODEL" may be found here. Please make sure to add the relevant vendor API key as "YOUR_UNICHAT_API_KEY".
Example:
"env": {
"UNICHAT_MODEL": "gpt-4o-mini",
"UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
}
Development/Unpublished Servers Configuration
"mcpServers": {
"unichat-mcp-server": {
"command": "uv",
"args": [
"--directory",
"{{your source code local directory}}/unichat-mcp-server",
"run",
"unichat-mcp-server"
],
"env": {
"UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
"UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
}
}
}
Published Servers Configuration
"mcpServers": {
"unichat-mcp-server": {
"command": "uvx",
"args": [
"unichat-mcp-server"
],
"env": {
"UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
"UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
}
}
}
To install Unichat for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install unichat-mcp-server --client claude
```
To prepare the package for distribution:
```bash
rm -rf dist
uv sync
uv build
```

This will create source and wheel distributions in the dist/ directory.
Then publish to PyPI:

```bash
uv publish --token {{YOUR_PYPI_API_TOKEN}}
```
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:

```bash
npx @modelcontextprotocol/inspector uv --directory {{your source code local directory}}/unichat-mcp-server run unichat-mcp-server
```
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.