🧱 Bricks and Context

Production-grade Model Context Protocol (MCP) server for Databricks

CI · Python 3.10+ · License: MIT · MCP

SQL Warehouses · Jobs API · Multi-Workspace · Built for AI Agents


✨ What is this?

Bricks and Context lets AI assistants (Cursor, Claude Desktop, etc.) talk directly to your Databricks workspaces through the Model Context Protocol.

Think of it as a bridge: your AI asks questions, this server translates them into Databricks API calls, and returns structured, AI-friendly responses.

Why use this?

| Pain point | How we solve it |
| --- | --- |
| AI gets overwhelmed by huge query results | **Bounded outputs**: configurable row/byte/cell limits |
| Flaky connections cause random failures | **Retries + circuit breakers**: automatic fault tolerance |
| Managing multiple environments is tedious | **Multi-workspace**: switch between dev/prod with one parameter |
| Raw API responses confuse AI models | **Markdown tables**: structured, LLM-optimized output |
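To make the bounded-output idea concrete, here is a minimal sketch: cap rows, truncate long cell values, and render the result as a Markdown table. The function name and default limits are illustrative, not the server's actual implementation.

```python
def to_bounded_markdown(columns, rows, max_rows=200, max_cell_chars=200):
    """Render query results as a Markdown table within hard limits."""
    def clip(value):
        text = str(value)
        return text if len(text) <= max_cell_chars else text[:max_cell_chars] + "…"

    kept = rows[:max_rows]
    lines = [
        "| " + " | ".join(columns) + " |",
        "| " + " | ".join("---" for _ in columns) + " |",
    ]
    lines += ["| " + " | ".join(clip(v) for v in row) + " |" for row in kept]
    if len(rows) > max_rows:
        # Tell the model what was dropped instead of silently truncating.
        lines.append(f"_… {len(rows) - max_rows} more rows truncated_")
    return "\n".join(lines)

print(to_bounded_markdown(["id", "name"], [(1, "alice"), (2, "bob"), (3, "carol")], max_rows=2))
```

The key design point is that limits are enforced before the response leaves the server, so the model never receives an unbounded payload.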

🔧 Available Tools

**SQL Tools**

| Tool | What it does |
| --- | --- |
| `execute_sql_query` | Run SQL with bounded, AI-safe output |
| `discover_schemas` | List all schemas in the workspace |
| `discover_tables` | List tables in a schema with metadata |
| `describe_table` | Get column types, nullability, structure |
| `get_table_sample` | Preview rows for data exploration |
| `connection_health` | Verify Databricks connectivity |

**Job Tools**

| Tool | What it does |
| --- | --- |
| `list_jobs` | List jobs with optional name filtering |
| `get_job_details` | Full job config: schedule, cluster, tasks |
| `get_job_runs` | Run history with state and duration |
| `trigger_job` | Start a job with optional parameters |
| `cancel_job_run` | Stop a running job |
| `get_job_run_output` | Retrieve logs, errors, notebook output |

**Observability**

| Tool | What it does |
| --- | --- |
| `cache_stats` | Hit rates, memory usage, category breakdown |
| `performance_stats` | Operation latencies, error rates, health |

🚀 Quick Start

1. Clone & Install

git clone https://github.com/laraib-sidd/bricks-and-context.git
cd bricks-and-context
uv sync  # or: pip install -e .

2. Configure Workspaces

Copy the template and add your credentials:

cp auth.template.yaml auth.yaml

Edit auth.yaml:

default_workspace: dev

workspaces:
  - name: dev
    host: your-dev.cloud.databricks.com
    token: dapi...
    http_path: /sql/1.0/warehouses/...

  - name: prod
    host: your-prod.cloud.databricks.com
    token: dapi...
    http_path: /sql/1.0/warehouses/...

💡 auth.yaml is gitignored. Your secrets stay local.

3. Run

python run_mcp_server.py

🎯 Cursor Integration

Cursor uses stdio transport and doesn't inherit your shell environment. You need explicit paths.

Step 1: Ensure dependencies are installed

cd /path/to/bricks-and-context
uv sync

Step 2: Open MCP settings in Cursor

Cmd+Shift+P → "Open MCP Settings" → opens ~/.cursor/mcp.json

Step 3: Add this configuration

Using uv run (recommended):

{
  "mcpServers": {
    "databricks": {
      "command": "uv",
      "args": [
        "--directory", "/path/to/bricks-and-context",
        "run", "python", "run_mcp_server.py"
      ],
      "env": {
        "MCP_AUTH_PATH": "/path/to/bricks-and-context/auth.yaml",
        "MCP_CONFIG_PATH": "/path/to/bricks-and-context/config.json"
      }
    }
  }
}

Or using venv directly:

{
  "mcpServers": {
    "databricks": {
      "command": "/path/to/bricks-and-context/.venv/bin/python",
      "args": ["/path/to/bricks-and-context/run_mcp_server.py"],
      "env": {
        "MCP_AUTH_PATH": "/path/to/bricks-and-context/auth.yaml",
        "MCP_CONFIG_PATH": "/path/to/bricks-and-context/config.json"
      }
    }
  }
}

Step 4: Restart Cursor

Reload the window to activate the MCP server.

Test it

Ask your AI:

  • "List my Databricks jobs"
  • "Run SELECT 1 on Databricks"
  • "Describe the table catalog.schema.my_table"

🌐 Multi-Workspace

Define multiple workspaces in auth.yaml, then select per-call:

execute_sql_query(sql="SELECT 1", workspace="prod")
list_jobs(limit=10, workspace="dev")

When workspace is omitted, the server uses default_workspace.
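The fallback rule can be sketched in a few lines. The structure below mirrors auth.yaml, but the names and the `resolve_workspace` helper are illustrative, not the server's actual code:

```python
# Illustrative per-call workspace resolution: an explicit `workspace`
# argument wins; otherwise fall back to default_workspace (as in auth.yaml).
DEFAULT_WORKSPACE = "dev"

WORKSPACES = {
    "dev":  {"host": "your-dev.cloud.databricks.com"},
    "prod": {"host": "your-prod.cloud.databricks.com"},
}

def resolve_workspace(workspace=None):
    name = workspace or DEFAULT_WORKSPACE
    if name not in WORKSPACES:
        raise ValueError(f"Unknown workspace: {name!r}")
    return name, WORKSPACES[name]

print(resolve_workspace()[0])        # falls back to the default: dev
print(resolve_workspace("prod")[0])  # explicit selection: prod
```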


⚙️ Configuration

config.json: tunable settings (committed)

| Setting | Default | Description |
| --- | --- | --- |
| `max_connections` | 10 | Connection pool size |
| `max_result_rows` | 200 | Max rows returned per query |
| `max_result_bytes` | 262144 | Max response size (256 KB) |
| `max_cell_chars` | 200 | Truncate long cell values |
| `allow_write_queries` | false | Enable INSERT/UPDATE/DELETE |
| `enable_sql_retries` | true | Retry transient SQL failures |
| `enable_query_cache` | false | Cache repeated queries |
| `query_cache_ttl_seconds` | 300 | Cache TTL |
| `databricks_api_timeout_seconds` | 30 | Jobs API timeout |

Any setting can be overridden via an environment variable (uppercase name, e.g. MAX_RESULT_ROWS=500).
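A minimal sketch of that override rule, assuming the defaults come from config.json and an uppercase environment variable of the same name replaces any setting (the type-coercion logic here is an assumption, not the server's exact code):

```python
DEFAULTS = {"max_result_rows": 200, "enable_query_cache": False, "query_cache_ttl_seconds": 300}

def load_settings(defaults, environ):
    settings = dict(defaults)
    for key, default in defaults.items():
        raw = environ.get(key.upper())
        if raw is None:
            continue
        if isinstance(default, bool):  # check bool before int (bool is an int subclass)
            settings[key] = raw.lower() in ("1", "true", "yes")
        elif isinstance(default, int):
            settings[key] = int(raw)
        else:
            settings[key] = raw
    return settings

print(load_settings(DEFAULTS, {"MAX_RESULT_ROWS": "500"}))
# {'max_result_rows': 500, 'enable_query_cache': False, 'query_cache_ttl_seconds': 300}
```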


๐Ÿ—๏ธ Architecture

┌─────────────────────────────────────────────────────────┐
│              MCP Client (Cursor / Claude)               │
└────────────────────────────┬────────────────────────────┘
                             │ stdio
                             ▼
┌─────────────────────────────────────────────────────────┐
│                     FastMCP Server                      │
│  ┌───────────┐  ┌───────────┐  ┌─────────────────┐      │
│  │ SQL Tools │  │ Job Tools │  │  Observability  │      │
│  └─────┬─────┘  └─────┬─────┘  └────────┬────────┘      │
└────────┼──────────────┼─────────────────┼───────────────┘
         │              │                 │
         ▼              ▼                 ▼
┌─────────────────┐ ┌─────────────────┐ ┌──────────────────────┐
│ Connection Pool │ │   Job Manager   │ │ Cache / Perf Monitor │
│ (SQL Connector) │ │ (REST API 2.1)  │ │                      │
└────────┬────────┘ └────────┬────────┘ └──────────────────────┘
         │                   │
         └─────────┬─────────┘
                   ▼
┌─────────────────────────────────────────────────────────┐
│                Databricks Workspace(s)                  │
│          SQL Warehouse         Jobs Service             │
└─────────────────────────────────────────────────────────┘

🛡️ Reliability Features

| Feature | Description |
| --- | --- |
| Bounded outputs | Row, byte, and cell-character limits prevent OOM |
| Connection pooling | Thread-safe, with per-connection health validation |
| Retry with backoff | Exponential backoff + jitter for transient failures |
| Circuit breakers | Automatic fault isolation; prevents cascading failures |
| Query caching | Optional TTL-based caching for repeated queries |
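Retry with exponential backoff and jitter can be sketched as below. The delays, retry count, and the use of `ConnectionError` as the "transient" error type are illustrative assumptions, not the server's actual policy:

```python
import random
import time

def retry_with_backoff(func, retries=3, base_delay=0.5, sleep=time.sleep):
    """Retry transient failures with exponential backoff plus random jitter."""
    for attempt in range(retries + 1):
        try:
            return func()
        except ConnectionError:
            if attempt == retries:
                raise  # retries exhausted: surface the error
            delay = base_delay * (2 ** attempt)
            sleep(delay + random.uniform(0, delay))  # jitter de-synchronizes clients

# Demo: a call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(retry_with_backoff(flaky, sleep=lambda _: None))  # → ok
```

Jitter matters because, without it, many clients that failed at the same moment would all retry at the same moment and overload the warehouse again.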

🧑‍💻 Development

uv sync --dev        # Install dev dependencies
uv run pytest        # Run tests
uv run black .       # Format code
uv run mypy src/     # Type check

📄 License

MIT. See LICENSE.

Related Servers