# mcp-timeplus

An MCP server for Timeplus: interact with the Timeplus real-time data platform for querying and managing data streams.
## Prompts

* `generate_sql`: gives the LLM more knowledge about how to query Timeplus via SQL.

## Tools

* `run_sql`
  - Execute a SQL query on Timeplus (see the example call after this list).
  - Input: `sql` (string): The SQL query to execute.
  - All Timeplus queries are run with `readonly = 1` to ensure they are safe. If you want to run DDL or DML queries, you can set the environment variable `TIMEPLUS_READ_ONLY` to `false`.
* `list_databases`
  - List all databases on the Timeplus server.
* `list_tables`
  - List all tables in a database.
  - Input: `database` (string): The name of the database.
* `list_kafka_topics`
  - List all topics in the configured Kafka cluster.
* `explore_kafka_topic`
  - Show recent messages in a Kafka topic.
  - Inputs: `topic` (string): The name of the topic. `message_count` (int): The number of messages to show; defaults to 1.
* `create_kafka_stream`
  - Set up a Timeplus stream that reads from a Kafka topic.
  - Input: `topic` (string): The name of the topic.
* `connect_to_apache_iceberg`
  - Connect to an Apache Iceberg database.
  - Inputs: `iceberg_db` (string): The name of the Iceberg database. `aws_account_id` (int): The AWS account ID (12 digits). `s3_bucket` (string): The S3 bucket name. `aws_region` (string): The AWS region; defaults to `"us-west-2"`. `is_s3_table_bucket` (bool): Whether the S3 bucket is an S3 table bucket; defaults to `False`.
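Once the server is configured (see below), any MCP client can invoke these tools. As a minimal sketch, the snippet below uses the `mcp` Python SDK to launch the server via `uvx` and call `run_sql`; the connection details in `env` are placeholders you would replace with your own.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder connection details -- replace with your own Timeplus service.
server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-timeplus"],
    env={"TIMEPLUS_HOST": "localhost", "TIMEPLUS_PORT": "8123",
         "TIMEPLUS_USER": "default", "TIMEPLUS_PASSWORD": ""},
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Read-only query; DDL/DML would need TIMEPLUS_READ_ONLY=false.
            result = await session.call_tool("run_sql", arguments={"sql": "SELECT 1"})
            print(result.content)

asyncio.run(main())
```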
## Configuration

First, ensure you have the `uv` executable installed. If not, you can install it by following the instructions here.
Then add this MCP server to your Claude Desktop configuration file:

* macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
* Windows: `%APPDATA%/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "mcp-timeplus": {
      "command": "uvx",
      "args": ["mcp-timeplus"],
      "env": {
        "TIMEPLUS_HOST": "",
        "TIMEPLUS_PORT": "",
        "TIMEPLUS_USER": "",
        "TIMEPLUS_PASSWORD": "",
        "TIMEPLUS_SECURE": "false",
        "TIMEPLUS_VERIFY": "true",
        "TIMEPLUS_CONNECT_TIMEOUT": "30",
        "TIMEPLUS_SEND_RECEIVE_TIMEOUT": "30",
        "TIMEPLUS_READ_ONLY": "false",
        "TIMEPLUS_KAFKA_CONFIG": "{\"bootstrap.servers\":\"a.aivencloud.com:28864\",\"sasl.mechanism\":\"SCRAM-SHA-256\",\"sasl.username\":\"avnadmin\",\"sasl.password\":\"thePassword\",\"security.protocol\":\"SASL_SSL\",\"enable.ssl.certificate.verification\":\"false\"}"
      }
    }
  }
}
```
Update the environment variables to point to your own Timeplus service.
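Because `TIMEPLUS_KAFKA_CONFIG` is itself a JSON document embedded in a JSON string, its inner quotes must be escaped inside `claude_desktop_config.json`. Here is a small stdlib-only sketch for producing the escaped value; the broker settings shown are the placeholder values from the example above.

```python
import json

# Placeholder Kafka settings from the example above -- replace with your own.
kafka_config = {
    "bootstrap.servers": "a.aivencloud.com:28864",
    "sasl.mechanism": "SCRAM-SHA-256",
    "sasl.username": "avnadmin",
    "sasl.password": "thePassword",
    "security.protocol": "SASL_SSL",
    "enable.ssl.certificate.verification": "false",
}

# json.dumps once builds the TIMEPLUS_KAFKA_CONFIG value; dumping it again
# produces the escaped form ready to paste into claude_desktop_config.json.
print(json.dumps(json.dumps(kafka_config)))
```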
You can also try this MCP server with other MCP clients, such as 5ire.
## Development

In the `test-services` directory, run `docker compose up -d` to start a Timeplus Proton server. You can also download it via `curl https://install.timeplus.com/oss | sh`, then start it with `./proton server`.
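Before wiring up the MCP server, you can sanity-check that the local server is reachable. This sketch assumes Proton exposes the ClickHouse-compatible HTTP interface on port 8123, including the `/ping` endpoint; adjust host and port if you changed the defaults.

```python
import urllib.request

# Smoke test for the local Timeplus Proton server started above.
with urllib.request.urlopen("http://localhost:8123/ping", timeout=5) as resp:
    print(resp.status, resp.read().decode().strip())  # expected: 200 Ok.
```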
Create a `.env` file in the root of the repository:

```
TIMEPLUS_HOST=localhost
TIMEPLUS_PORT=8123
TIMEPLUS_USER=default
TIMEPLUS_PASSWORD=
TIMEPLUS_SECURE=false
TIMEPLUS_VERIFY=true
TIMEPLUS_CONNECT_TIMEOUT=30
TIMEPLUS_SEND_RECEIVE_TIMEOUT=30
TIMEPLUS_READ_ONLY=false
TIMEPLUS_KAFKA_CONFIG={"bootstrap.servers":"a.aivencloud.com:28864", "sasl.mechanism":"SCRAM-SHA-256","sasl.username":"avnadmin", "sasl.password":"thePassword","security.protocol":"SASL_SSL","enable.ssl.certificate.verification":"false"}
```
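If you want to reuse the same `.env` values in your own test scripts, one option (assumed here for illustration, not necessarily a dependency of this repo) is `python-dotenv`:

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current working directory
print(os.getenv("TIMEPLUS_HOST"), os.getenv("TIMEPLUS_PORT"))
```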
Run `uv sync` to install the dependencies, then activate the virtual environment with `source .venv/bin/activate`. Run `mcp dev mcp_timeplus/mcp_server.py` to start the MCP server. Click the "Connect" button to connect the UI with the MCP server, then switch to the "Tools" tab to run the available tools.

To build the Docker image, run `docker build -t mcp_timeplus .`.

## Environment Variables

The following environment variables are used to configure the Timeplus connection:
* `TIMEPLUS_HOST`: The hostname of your Timeplus server
* `TIMEPLUS_USER`: The username for authentication
* `TIMEPLUS_PASSWORD`: The password for authentication
* `TIMEPLUS_PORT`: The port number of your Timeplus server
  - Default: `8443` if HTTPS is enabled, `8123` if disabled
* `TIMEPLUS_SECURE`: Enable/disable HTTPS connection
  - Default: `"false"`. Set to `"true"` for secure connections
* `TIMEPLUS_VERIFY`: Enable/disable SSL certificate verification
  - Default: `"true"`. Set to `"false"` to disable certificate verification (not recommended for production)
* `TIMEPLUS_CONNECT_TIMEOUT`: Connection timeout in seconds
  - Default: `"30"`
* `TIMEPLUS_SEND_RECEIVE_TIMEOUT`: Send/receive timeout in seconds
  - Default: `"300"`
* `TIMEPLUS_DATABASE`: Default database to use
* `TIMEPLUS_READ_ONLY`: Enable/disable read-only mode
  - Default: `"true"`. Set to `"false"` to enable DDL/DML
* `TIMEPLUS_KAFKA_CONFIG`: A JSON string for the Kafka configuration. Please refer to the librdkafka configuration or take the above example as a reference.
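To illustrate how these defaults interact (in particular, the port depending on `TIMEPLUS_SECURE`), here is a sketch of how a client might resolve them; the actual logic inside mcp-timeplus may differ.

```python
import os

# Sketch only: variable names and defaults follow the list above, but the
# exact resolution logic inside mcp-timeplus may differ.
secure = os.getenv("TIMEPLUS_SECURE", "false").lower() == "true"
settings = {
    "host": os.environ["TIMEPLUS_HOST"],  # no default; must be set
    "port": int(os.getenv("TIMEPLUS_PORT", "8443" if secure else "8123")),
    "username": os.environ["TIMEPLUS_USER"],
    "password": os.environ["TIMEPLUS_PASSWORD"],
    "secure": secure,
    "verify": os.getenv("TIMEPLUS_VERIFY", "true").lower() == "true",
    "connect_timeout": int(os.getenv("TIMEPLUS_CONNECT_TIMEOUT", "30")),
    "send_receive_timeout": int(os.getenv("TIMEPLUS_SEND_RECEIVE_TIMEOUT", "300")),
    "database": os.getenv("TIMEPLUS_DATABASE"),
    "read_only": os.getenv("TIMEPLUS_READ_ONLY", "true").lower() == "true",
}
print(settings)
```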