PGMCP - PostgreSQL Model Context Protocol Server
PGMCP connects AI assistants to any PostgreSQL database through natural language queries. Ask questions in plain English and get structured SQL results with automatic streaming and robust error handling.
Works with: Cursor, Claude Desktop, VS Code extensions, and any MCP-compatible client
Quick Start
PGMCP connects to your existing PostgreSQL database and makes it accessible to AI assistants through natural language queries.
Prerequisites
- PostgreSQL database (existing database with your schema)
- OpenAI API key (optional, for AI-powered SQL generation)
Basic Usage
# Set up environment variables
export DATABASE_URL="postgres://user:password@localhost:5432/your-existing-db"
export OPENAI_API_KEY="your-api-key" # Optional
# Run server (using pre-compiled binary)
./pgmcp-server
# Test with client in another terminal
./pgmcp-client -ask "What tables do I have?" -format table
./pgmcp-client -ask "Who is the customer that has placed the most orders?" -format table
./pgmcp-client -search "john" -format table
Here is how it works:
User / AI Assistant
        │
        │  "Who are the top customers?"
        ▼
┌──────────────────────────────────────────────────────────┐
│                      Any MCP Client                      │
│                                                          │
│  PGMCP CLI │ Cursor │ Claude Desktop │ VS Code │ ...     │
│  JSON/CSV  │ Chat   │ AI Assistant   │ Editor  │         │
└──────────────────────────────────────────────────────────┘
        │
        │  Streamable HTTP / MCP Protocol
        ▼
┌──────────────────────────────────────────────────────────┐
│                       PGMCP Server                       │
│                                                          │
│  Security            AI Engine        Streaming          │
│  • Input validation  • Schema cache   • Auto-pagination  │
│  • Audit log         • OpenAI API     • Memory management│
│  • SQL guard         • Error recovery • Connection pool  │
└──────────────────────────────────────────────────────────┘
        │
        │  Read-Only SQL Queries
        ▼
┌──────────────────────────────────────────────────────────┐
│                 Your PostgreSQL Database                 │
│                                                          │
│  Any schema: e-commerce, analytics, CRM, etc.            │
│  Tables • Views • Indexes • Functions                    │
└──────────────────────────────────────────────────────────┘

External AI services: OpenAI API • Anthropic • Local LLMs (Ollama, etc.)
Key Benefits:
- Works with any PostgreSQL database (no assumptions about schema)
- No schema modifications required
- Read-only access (all queries run in read-only transactions)
- Automatic streaming for large results
- Intelligent query understanding (singular vs. plural)
- Robust error handling (graceful recovery from AI failures)
- PostgreSQL case sensitivity support (mixed-case tables)
- Production-ready security and performance
- Universal database compatibility
- Multiple output formats (table, JSON, CSV)
- Free-text search across all columns
- Authentication support
- Comprehensive testing suite
Features
- Natural Language to SQL: Ask questions in plain English
- Automatic Streaming: Handles large result sets automatically
- Safe Read-Only Access: Prevents any write operations
- Text Search: Search across all text columns
- Multiple Output Formats: Table, JSON, and CSV
- PostgreSQL Case Sensitivity: Handles mixed-case table names correctly
- Universal Compatibility: Works with any PostgreSQL database
Environment Variables
Required:
- DATABASE_URL: PostgreSQL connection string to your existing database

Optional:
- OPENAI_API_KEY: OpenAI API key for AI-powered SQL generation
- OPENAI_MODEL: Model to use (default: "gpt-4o-mini")
- HTTP_ADDR: Server address (default: ":8080")
- HTTP_PATH: MCP endpoint path (default: "/mcp")
- AUTH_BEARER: Bearer token for authentication
Installation
Download Pre-compiled Binaries
- Go to GitHub Releases
- Download the binary for your platform (Linux, macOS, Windows)
- Extract and run:
# Example for macOS/Linux
tar xzf pgmcp_*.tar.gz
cd pgmcp_*
./pgmcp-server
Alternative Options
# Homebrew (macOS/Linux) - Available after first release
brew tap subnetmarco/homebrew-tap
brew install pgmcp
# Build from source
go build -o pgmcp-server ./server
go build -o pgmcp-client ./client
Docker/Kubernetes
# Docker
docker run -e DATABASE_URL="postgres://user:pass@host:5432/db" \
-p 8080:8080 ghcr.io/subnetmarco/pgmcp:latest
# Kubernetes (see examples/ directory for full manifests)
kubectl create secret generic pgmcp-secret \
--from-literal=database-url="postgres://user:pass@host:5432/db"
kubectl apply -f examples/k8s/
Quick Start with the Sample Schema
# Set up database (optional - works with any existing PostgreSQL database)
export DATABASE_URL="postgres://user:password@localhost:5432/mydb"
psql $DATABASE_URL < schema.sql
# Run server
export OPENAI_API_KEY="your-api-key"
./pgmcp-server
# Test with client
./pgmcp-client -ask "Who is the user that places the most orders?" -format table
./pgmcp-client -ask "Show me the top 40 most reviewed items in the marketplace" -format table
Usage Examples
# Ask questions in natural language
./pgmcp-client -ask "What are the top 5 customers?" -format table
./pgmcp-client -ask "How many orders were placed today?" -format json
# Search across all text fields
./pgmcp-client -search "john" -format table
# Multiple questions at once
./pgmcp-client -ask "Show tables" -ask "Count users" -format table
# Different output formats
./pgmcp-client -ask "Export all data" -format csv -max-rows 1000
Example Database
The project includes two schemas:
- schema.sql: Full Amazon-like marketplace with 5,000+ records
- schema_minimal.sql: Minimal test schema with a mixed-case "Categories" table

Key features:
- Mixed-case table names ("Categories") for testing case sensitivity
- Composite primary keys (order_items) for testing AI assumptions
- Realistic relationships and data types
Use your own database:
export DATABASE_URL="postgres://user:pass@host:5432/your_db"
./pgmcp-server
./pgmcp-client -ask "What tables do I have?"
AI Error Handling
When AI generates incorrect SQL, PGMCP handles it gracefully:
{
"error": "Column not found in generated query",
"suggestion": "Try rephrasing your question or ask about specific tables",
"original_sql": "SELECT non_existent_column FROM table..."
}
Instead of crashing, the system provides helpful feedback and continues operating.
MCP Integration
Cursor Integration
# Start server
export DATABASE_URL="postgres://user:pass@localhost:5432/your_db"
./pgmcp-server
Add to Cursor settings:
{
"mcp.servers": {
"pgmcp": {
"transport": {
"type": "http",
"url": "http://localhost:8080/mcp"
}
}
}
}
Claude Desktop Integration
Edit ~/.config/claude-desktop/claude_desktop_config.json:
{
"mcpServers": {
"pgmcp": {
"transport": {
"type": "http",
"url": "http://localhost:8080/mcp"
}
}
}
}
API Tools
- ask: Natural language questions → SQL queries with automatic streaming
- search: Free-text search across all database text columns
- stream: Advanced streaming for very large result sets with pagination
Safety Features
- Read-Only Enforcement: Blocks write operations (INSERT, UPDATE, DELETE, etc.)
- Query Timeouts: Prevents long-running queries
- Input Validation: Sanitizes and validates all user input
- Transaction Isolation: All queries run in read-only transactions
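As a rough sketch of the read-only guard idea (simplified; the hypothetical `isReadOnly` helper below is illustration only, not the project's implementation, and keyword matching alone is not what makes PGMCP safe; running every query inside a read-only transaction is):

```go
package main

import (
	"fmt"
	"strings"
)

// isReadOnly reports whether a statement looks like a plain read.
// This prefix check is a simplified illustration; the real
// protection is executing queries in READ ONLY transactions.
func isReadOnly(query string) bool {
	q := strings.ToUpper(strings.TrimSpace(query))
	for _, kw := range []string{"INSERT", "UPDATE", "DELETE", "DROP",
		"ALTER", "TRUNCATE", "CREATE", "GRANT", "REVOKE"} {
		if strings.HasPrefix(q, kw) {
			return false
		}
	}
	return strings.HasPrefix(q, "SELECT") ||
		strings.HasPrefix(q, "WITH") ||
		strings.HasPrefix(q, "EXPLAIN")
}

func main() {
	fmt.Println(isReadOnly("SELECT * FROM orders")) // true
	fmt.Println(isReadOnly("DROP TABLE orders"))    // false
}
```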
Testing
# Unit tests
go test ./server -v
# Integration tests (requires PostgreSQL)
go test ./server -tags=integration -v
License
Apache 2.0 - See LICENSE file for details.
Related Projects
- Model Context Protocol - The underlying protocol specification
- MCP Go SDK - Go implementation of MCP
PGMCP makes your PostgreSQL database accessible to AI assistants through natural language while maintaining security through read-only access controls.