PGMCP - PostgreSQL Model Context Protocol Server
Query any PostgreSQL database using natural language.
PGMCP connects AI assistants to any PostgreSQL database through natural language queries. Ask questions in plain English and get structured SQL results with automatic streaming and robust error handling.
Works with: Cursor, Claude Desktop, VS Code extensions, and any MCP-compatible client
Quick Start
PGMCP connects to your existing PostgreSQL database and makes it accessible to AI assistants through natural language queries.
Prerequisites
- PostgreSQL database (existing database with your schema)
- OpenAI API key (optional, for AI-powered SQL generation)
Basic Usage
# Set up environment variables
export DATABASE_URL="postgres://user:password@localhost:5432/your-existing-db"
export OPENAI_API_KEY="your-api-key" # Optional
# Run server (using pre-compiled binary)
./pgmcp-server
# Test with client in another terminal
./pgmcp-client -ask "What tables do I have?" -format table
./pgmcp-client -ask "Who is the customer that has placed the most orders?" -format table
./pgmcp-client -search "john" -format table
Here is how it works:
🤖 User / AI Assistant
        │
        │  "Who are the top customers?"
        ▼
┌───────────────────────────────────────────────────────────┐
│                      Any MCP Client                       │
│                                                           │
│   PGMCP CLI │ Cursor │ Claude Desktop │ VS Code │ ...     │
│   JSON/CSV  │ Chat   │ AI Assistant   │ Editor  │         │
└───────────────────────────────────────────────────────────┘
        │
        │  Streamable HTTP / MCP Protocol
        ▼
┌───────────────────────────────────────────────────────────┐
│                       PGMCP Server                        │
│                                                           │
│   🔒 Security        🧠 AI Engine        📊 Streaming     │
│   • Input Valid.     • Schema Cache      • Auto-Pagination│
│   • Audit Log        • OpenAI API        • Memory Mgmt    │
│   • SQL Guard        • Error Recovery    • Connection Pool│
└───────────────────────────────────────────────────────────┘
        │
        │  Read-Only SQL Queries
        ▼
┌───────────────────────────────────────────────────────────┐
│                 Your PostgreSQL Database                  │
│                                                           │
│   Any Schema: E-commerce, Analytics, CRM, etc.            │
│   Tables • Views • Indexes • Functions                    │
└───────────────────────────────────────────────────────────┘

External AI Services:
OpenAI API • Anthropic • Local LLMs (Ollama, etc.)
Key Benefits:
✅ Works with ANY PostgreSQL database (no assumptions about schema)
✅ No schema modifications required
✅ Read-only access (queries run in read-only transactions)
✅ Automatic streaming for large results
✅ Intelligent query understanding (singular vs. plural)
✅ Robust error handling (graceful recovery from AI failures)
✅ PostgreSQL case sensitivity support (mixed-case tables)
✅ Production-ready security and performance
✅ Universal database compatibility
✅ Multiple output formats (table, JSON, CSV)
✅ Free-text search across all columns
✅ Authentication support
✅ Comprehensive testing suite
Features
- Natural Language to SQL: Ask questions in plain English
- Automatic Streaming: Handles large result sets automatically
- Safe Read-Only Access: Prevents any write operations
- Text Search: Search across all text columns
- Multiple Output Formats: Table, JSON, and CSV
- PostgreSQL Case Sensitivity: Handles mixed-case table names correctly
- Universal Compatibility: Works with any PostgreSQL database
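Correct mixed-case handling comes down to quoting identifiers, since PostgreSQL folds unquoted names to lowercase. As an illustrative sketch (not PGMCP's actual code), a hypothetical helper shows the quoting rule, including escaping of embedded double quotes:

```go
package main

import (
	"fmt"
	"strings"
)

// quoteIdent double-quotes a PostgreSQL identifier so that mixed-case
// names like Categories are not folded to lowercase. Embedded double
// quotes are escaped by doubling them, per SQL identifier rules.
func quoteIdent(name string) string {
	return `"` + strings.ReplaceAll(name, `"`, `""`) + `"`
}

func main() {
	fmt.Println("SELECT * FROM " + quoteIdent("Categories")) // SELECT * FROM "Categories"
}
```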
Environment Variables
Required:
DATABASE_URL: PostgreSQL connection string to your existing database
Optional:
OPENAI_API_KEY: OpenAI API key for AI-powered SQL generation
OPENAI_MODEL: Model to use (default: "gpt-4o-mini")
HTTP_ADDR: Server address (default: ":8080")
HTTP_PATH: MCP endpoint path (default: "/mcp")
AUTH_BEARER: Bearer token for authentication
Installation
Download Pre-compiled Binaries
- Go to GitHub Releases
- Download the binary for your platform (Linux, macOS, Windows)
- Extract and run:
# Example for macOS/Linux
tar xzf pgmcp_*.tar.gz
cd pgmcp_*
./pgmcp-server
Alternative Options
# Homebrew (macOS/Linux) - Available after first release
brew tap subnetmarco/homebrew-tap
brew install pgmcp
# Build from source
go build -o pgmcp-server ./server
go build -o pgmcp-client ./client
Docker/Kubernetes
# Docker
docker run -e DATABASE_URL="postgres://user:pass@host:5432/db" \
-p 8080:8080 ghcr.io/subnetmarco/pgmcp:latest
# Kubernetes (see examples/ directory for full manifests)
kubectl create secret generic pgmcp-secret \
--from-literal=database-url="postgres://user:pass@host:5432/db"
kubectl apply -f examples/k8s/
Quick Start with the Example Schema
# Set up database (optional - works with any existing PostgreSQL database)
export DATABASE_URL="postgres://user:password@localhost:5432/mydb"
psql $DATABASE_URL < schema.sql
# Run server
export OPENAI_API_KEY="your-api-key"
./pgmcp-server
# Test with client
./pgmcp-client -ask "Who is the user that places the most orders?" -format table
./pgmcp-client -ask "Show me the top 40 most reviewed items in the marketplace" -format table
Usage Examples
# Ask questions in natural language
./pgmcp-client -ask "What are the top 5 customers?" -format table
./pgmcp-client -ask "How many orders were placed today?" -format json
# Search across all text fields
./pgmcp-client -search "john" -format table
# Multiple questions at once
./pgmcp-client -ask "Show tables" -ask "Count users" -format table
# Different output formats
./pgmcp-client -ask "Export all data" -format csv -max-rows 1000
Example Database
The project includes two schemas:
schema.sql: Full Amazon-like marketplace with 5,000+ records
schema_minimal.sql: Minimal test schema with a mixed-case "Categories" table
Key features:
- Mixed-case table names ("Categories") for testing case sensitivity
- Composite primary keys (order_items) for testing AI assumptions
- Realistic relationships and data types
Use your own database:
export DATABASE_URL="postgres://user:pass@host:5432/your_db"
./pgmcp-server
./pgmcp-client -ask "What tables do I have?"
AI Error Handling
When AI generates incorrect SQL, PGMCP handles it gracefully:
{
"error": "Column not found in generated query",
"suggestion": "Try rephrasing your question or ask about specific tables",
"original_sql": "SELECT non_existent_column FROM table..."
}
Instead of crashing, the system provides helpful feedback and continues operating.
MCP Integration
Cursor Integration
# Start server
export DATABASE_URL="postgres://user:pass@localhost:5432/your_db"
./pgmcp-server
Add to Cursor settings:
{
"mcp.servers": {
"pgmcp": {
"transport": {
"type": "http",
"url": "http://localhost:8080/mcp"
}
}
}
}
Claude Desktop Integration
Edit ~/.config/claude-desktop/claude_desktop_config.json:
{
"mcpServers": {
"pgmcp": {
"transport": {
"type": "http",
"url": "http://localhost:8080/mcp"
}
}
}
}
API Tools
ask: Natural language questions → SQL queries with automatic streaming
search: Free-text search across all database text columns
stream: Advanced streaming for very large result sets with pagination
Safety Features
- Read-Only Enforcement: Blocks write operations (INSERT, UPDATE, DELETE, etc.)
- Query Timeouts: Prevents long-running queries
- Input Validation: Sanitizes and validates all user input
- Transaction Isolation: All queries run in read-only transactions
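A read-only guard of this kind can be approximated with a statement allowlist plus a keyword denylist. The sketch below is an assumption about the general technique, not PGMCP's actual implementation; real enforcement would also run every query inside a PostgreSQL READ ONLY transaction so the database itself rejects writes:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// starts matches statement types that are allowed to run at all.
var starts = regexp.MustCompile(`(?i)^(select|with|explain|show)\b`)

// forbidden matches write/DDL keywords anywhere in the statement,
// catching tricks like data-modifying CTEs (WITH x AS (DELETE ...)).
var forbidden = regexp.MustCompile(`(?i)\b(insert|update|delete|drop|alter|truncate|create|grant|revoke|copy)\b`)

// isReadOnly reports whether q looks like a safe read-only statement.
// This is a first line of defense only; a READ ONLY transaction is the
// authoritative backstop.
func isReadOnly(q string) bool {
	t := strings.TrimSpace(q)
	return starts.MatchString(t) && !forbidden.MatchString(t)
}

func main() {
	fmt.Println(isReadOnly("SELECT * FROM orders")) // true
	fmt.Println(isReadOnly("DELETE FROM orders"))   // false
	fmt.Println(isReadOnly("WITH d AS (DELETE FROM orders RETURNING *) SELECT * FROM d")) // false
}
```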
Testing
# Unit tests
go test ./server -v
# Integration tests (requires PostgreSQL)
go test ./server -tags=integration -v
License
Apache 2.0 - See LICENSE file for details.
Related Projects
- Model Context Protocol - The underlying protocol specification
- MCP Go SDK - Go implementation of MCP
PGMCP makes your PostgreSQL database accessible to AI assistants through natural language while maintaining security through read-only access controls.