tangerine-mcp
An MCP server for Tangerine, the Convo AI assistant backend: https://github.com/RedHatInsights/tangerine-backend
Running with Podman or Docker
You can run the tangerine-mcp server in a container using Podman or Docker. Make sure you have a valid OpenShift token.
Example configuration for running with Podman:
{
  "mcpServers": {
    "tangerine": {
      "command": "podman",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e", "TANGERINE_URL",
        "-e", "TANGERINE_TOKEN",
        "-e", "MCP_TRANSPORT",
        "quay.io/maorfr/tangerine-mcp:latest"
      ],
      "env": {
        "TANGERINE_URL": "https://tangerine.example.com",
        "TANGERINE_TOKEN": "REDACTED",
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
Replace REDACTED with your OpenShift token, and adjust TANGERINE_URL to point at your Tangerine instance.
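If you manage MCP client configuration programmatically, the example above can be generated rather than hand-edited. The sketch below is a minimal helper (the function name `tangerine_mcp_config` is hypothetical, not part of tangerine-mcp) that builds the same structure with your own URL and token:

```python
import json


def tangerine_mcp_config(url: str, token: str, transport: str = "stdio") -> dict:
    """Build an mcpServers entry matching the Podman example above.

    The env vars are passed through to the container via `-e NAME`,
    so only their values need to be set in the "env" block.
    """
    return {
        "mcpServers": {
            "tangerine": {
                "command": "podman",
                "args": [
                    "run", "-i", "--rm",
                    "-e", "TANGERINE_URL",
                    "-e", "TANGERINE_TOKEN",
                    "-e", "MCP_TRANSPORT",
                    "quay.io/maorfr/tangerine-mcp:latest",
                ],
                "env": {
                    "TANGERINE_URL": url,
                    "TANGERINE_TOKEN": token,
                    "MCP_TRANSPORT": transport,
                },
            }
        }
    }


if __name__ == "__main__":
    # Print a config ready to paste into your MCP client settings.
    print(json.dumps(tangerine_mcp_config(
        "https://tangerine.example.com", "REDACTED"), indent=2))
```

For Docker instead of Podman, change `"command"` to `"docker"`; the arguments are the same.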
Related Servers
ChatMCP
A cross-platform AI chat client supporting desktop, mobile, and web platforms.
Telephony MCP Server
Make voice calls and send SMS messages using the Vonage API.
mpc-bridge
Bridges an HTTP stream to stdin/stdout and back.
Confluence
Interact with Confluence to execute CQL queries, retrieve page content, and update pages.
Activitysmith
This MCP server exposes ActivitySmith notifications and live activity tools.
Gmail MCP Server
An MCP server that enables AI models to interact directly with the Gmail API to manage emails.
Multi Chat MCP Server (Google Chat)
Connect AI assistants like Cursor to Google Chat and beyond — enabling smart, extensible collaboration across chat platforms.
A2A MCP Server
A bridge server connecting Model Context Protocol (MCP) with Agent-to-Agent (A2A) protocol.
WeCom Bot
Sends various types of messages to a WeCom (WeChat Work) group robot.
Ntfy MCP Server
Send push notifications via the ntfy service, enabling LLMs and AI agents to notify your devices.