Execute KQL queries using Azure authentication. Requires Azure CLI login.
AI-Powered KQL Query Execution with Intelligent Schema Memory
A Model Context Protocol (MCP) server that provides intelligent KQL (Kusto Query Language) query execution with AI-powered schema caching and context assistance for Azure Data Explorer clusters.
```mermaid
graph TD
    A[User Submits KQL Query] --> B{Query Validation}
    B -->|Invalid| C[Syntax Error Response]
    B -->|Valid| D[Load Schema Context]
    D --> E{Schema Cache Available?}
    E -->|Yes| F[Load from Memory]
    E -->|No| G[Discover Schema]
    F --> H[Execute Query]
    G --> I[Cache Schema + AI Context]
    I --> H
    H --> J{Query Success?}
    J -->|Error| K[Enhanced Error Message]
    J -->|Success| L[Process Results]
    L --> M[Generate Visualization]
    M --> N[Return Results + Context]
    K --> O[AI Suggestions]
    O --> N
    style A fill:#e1f5fe
    style N fill:#e8f5e8
    style K fill:#ffebee
```
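The flow above can be sketched in Python. Everything here is illustrative only: the function names, cache shape, and callables are hypothetical stand-ins, not the server's actual API.

```python
# Illustrative sketch of the query-execution flow above.
# All names here are hypothetical; the real server's internals differ.
from typing import Callable

_schema_cache: dict = {}  # in-memory schema cache, keyed by cluster

def run_query(query: str,
              discover_schema: Callable,
              execute: Callable,
              cluster: str = "help.kusto.windows.net") -> dict:
    # 1. Validate: reject a trivially empty query up front.
    if not query.strip():
        return {"error": "Syntax error: empty query"}
    # 2. Load schema context from the cache, or discover and cache it.
    schema = _schema_cache.get(cluster)
    if schema is None:
        schema = discover_schema(cluster)
        _schema_cache[cluster] = schema  # cached for subsequent queries
    # 3. Execute; failures come back as enhanced error responses.
    try:
        rows = execute(query, schema)
    except Exception as exc:
        return {"error": f"Enhanced error: {exc}", "suggestions": []}
    # 4. Package results together with the schema context.
    return {"rows": rows, "context": schema}
```

Note how the cache makes the second query against the same cluster skip discovery entirely, which is the point of the `Schema Cache Available?` branch.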
```mermaid
graph TD
    A[User Requests Schema Discovery] --> B[Connect to Cluster]
    B --> C[Enumerate Databases]
    C --> D[Discover Tables]
    D --> E[Get Table Schemas]
    E --> F[AI Analysis]
    F --> G[Generate Descriptions]
    G --> H[Store in Memory]
    H --> I[Update Statistics]
    I --> J[Return Summary]
    style A fill:#e1f5fe
    style J fill:#e8f5e8
```
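A minimal sketch of the discovery loop above; the callables stand in for real cluster operations, and every name here is hypothetical rather than the server's actual code.

```python
# Illustrative sketch of the schema-discovery flow above.
# The callables stand in for real Kusto queries; names are hypothetical.
from typing import Callable

def discover(cluster: str,
             list_databases: Callable,
             list_tables: Callable,
             get_schema: Callable,
             describe: Callable):
    memory = {}
    for db in list_databases(cluster):          # enumerate databases
        memory[db] = {}
        for table in list_tables(cluster, db):  # discover tables
            schema = get_schema(cluster, db, table)
            memory[db][table] = {
                "schema": schema,
                # AI-generated description cached alongside the schema
                "description": describe(table, schema),
            }
    # Summary statistics returned to the caller
    stats = {"databases": len(memory),
             "tables": sum(len(t) for t in memory.values())}
    return memory, stats
```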
```bash
# Authenticate with Azure CLI first
az login

# Install from source
git clone https://github.com/4R9UN/mcp-kql-server.git && cd mcp-kql-server && pip install -e .

# Or install from PyPI
pip install mcp-kql-server
```
That's it! The server automatically creates its schema memory directory at `%APPDATA%\KQL_MCP` (Windows) or `~/.local/share/KQL_MCP` (Linux/macOS).

Add to your Claude Desktop MCP settings file (`mcp_settings.json`):
Location:
- Windows: `%APPDATA%\Claude\mcp_settings.json`
- macOS: `~/Library/Application Support/Claude/mcp_settings.json`
- Linux: `~/.config/Claude/mcp_settings.json`
```json
{
  "mcpServers": {
    "mcp-kql-server": {
      "command": "python",
      "args": ["-m", "mcp_kql_server"],
      "env": {}
    }
  }
}
```
Add to your VSCode MCP configuration.

Settings.json location:
- Windows: `%APPDATA%\Code\User\settings.json`
- macOS: `~/Library/Application Support/Code/User/settings.json`
- Linux: `~/.config/Code/User/settings.json`
```json
{
  "mcp.servers": {
    "mcp-kql-server": {
      "command": "python",
      "args": ["-m", "mcp_kql_server"],
      "cwd": null,
      "env": {}
    }
  }
}
```
Add to your Roo-code MCP settings file (`mcp_settings.json`):
```json
{
  "mcpServers": {
    "kql-server": {
      "command": "python",
      "args": ["-m", "mcp_kql_server"],
      "env": {},
      "description": "KQL Server for Azure Data Explorer queries with AI assistance"
    }
  }
}
```
For any MCP-compatible application:

```bash
# Command to run the server
python -m mcp_kql_server

# Server provides these tools:
# - kql_execute: Execute KQL queries with AI context
# - kql_schema_memory: Discover and cache cluster schemas
```
You can customize the server behavior with environment variables:
```json
{
  "mcpServers": {
    "mcp-kql-server": {
      "command": "python",
      "args": ["-m", "mcp_kql_server"],
      "env": {}
    }
  }
}
```
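As an example, `KQL_DEBUG` (the only variable this README mentions, under Troubleshooting) would reach the server process through the `env` block above. A sketch of how such a flag might be read; `configure_logging` is an illustrative name, not part of the package:

```python
# Illustrative: reading an environment variable passed via the MCP
# "env" block. KQL_DEBUG is the variable this README documents;
# configure_logging is a hypothetical helper, not the server's API.
import logging
import os

def configure_logging() -> int:
    # Treat common truthy strings as "debug on"
    debug = os.environ.get("KQL_DEBUG", "").lower() in ("1", "true", "yes")
    level = logging.DEBUG if debug else logging.INFO
    logging.basicConfig(level=level)
    return level
```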
```bash
az login
python -m mcp_kql_server
```

The server starts immediately, storing schema memory at `%APPDATA%\KQL_MCP\cluster_memory`.
The server provides two main tools:
- `kql_execute`: Execute KQL queries with AI context
- `kql_schema_memory`: Discover and cache cluster schemas
Ask your MCP client (like Claude):

"Execute this KQL query against the help cluster: `cluster('help.kusto.windows.net').database('Samples').StormEvents | take 10`, then summarize the result and give me high-level insights."
Ask your MCP client:
"Query the Samples database in the help cluster to show me the top 10 states by storm event count, include visualization"
Ask your MCP client:
"Discover and cache the schema for the help.kusto.windows.net cluster, then tell me what databases and tables are available"
Ask your MCP client:
"Using the StormEvents table in the Samples database on help cluster, show me all tornado events from 2007 with damage estimates over $1M"
Ask your MCP client:
"Analyze storm events by month for the year 2007 in the StormEvents table, group by event type and show as a visualization"
```mermaid
graph TD
    A[MCP Client<br/>Claude/AI/Custom] <--> B[MCP KQL Server<br/>FastMCP Framework]
    B <--> C[Azure Data Explorer<br/>Kusto Clusters]
    B <--> D[Schema Memory<br/>Local AI Cache]
    style A fill:#e1f5fe
    style B fill:#f3e5f5
    style C fill:#fff3e0
    style D fill:#e8f5e8
```
```
mcp-kql-server/
├── mcp_kql_server/
│   ├── __init__.py        # Package initialization
│   ├── mcp_server.py      # Main MCP server implementation
│   ├── execute_kql.py     # KQL query execution logic
│   ├── schema_memory.py   # Schema caching and discovery
│   ├── unified_memory.py  # Advanced memory management
│   ├── kql_auth.py        # Azure authentication
│   ├── utils.py           # Utility functions
│   └── constants.py       # Configuration constants
├── docs/                  # Documentation
├── Example/               # Usage examples
├── pyproject.toml         # Project configuration
└── README.md              # This file
```
```json
{
  "tool": "kql_execute",
  "input": {
    "query": "...",
    "cluster_memory_path": "/custom/memory/path"
  }
}
```
```json
{
  "tool": "kql_schema_memory",
  "input": {
    "cluster_uri": "mycluster",
    "force_refresh": true
  }
}
```
```jsonc
{
  "tool": "kql_execute",
  "input": {
    "query": "...",
    "use_schema_context": false,  // disable for faster execution
    "visualize": false            // disable for minimal response
  }
}
```
Authentication Errors

```bash
# Re-authenticate with Azure CLI
az login --tenant your-tenant-id
```

Memory Issues

```bash
# Clear schema cache if corrupted (automatic backup created)
# Windows:
del "%APPDATA%\KQL_MCP\schema_memory.json"
# macOS/Linux:
rm ~/.local/share/KQL_MCP/schema_memory.json
```
Connection Timeouts

Memory Path Issues

The server falls back to `~/.kql_mcp_memory` if the default path fails.

```bash
# Enable debug logging if needed
set KQL_DEBUG=true     # Windows
export KQL_DEBUG=true  # macOS/Linux
python -m mcp_kql_server
```
We welcome contributions!

Happy Querying!