Interact with any Gremlin-compatible graph database using natural language, with support for schema discovery, complex queries, and data import/export.
Connect AI agents like Claude, Cursor, and Windsurf to your graph databases!
An MCP (Model Context Protocol) server that enables AI assistants to interact with any Gremlin-compatible graph database through natural language. Query your data, discover schemas, analyze relationships, and manage graph data using simple conversations.
Talk to your graph database naturally:
Your AI assistant gets access to these powerful tools:
| Tool | Purpose | What It Does |
|---|---|---|
| `get_graph_status` | Health Check | Verify database connectivity and server status |
| `get_graph_schema` | Schema Discovery | Get the complete graph structure with nodes, edges, and relationships |
| `run_gremlin_query` | Query Execution | Execute any Gremlin traversal query with full syntax support |
| `refresh_schema_cache` | Cache Management | Force an immediate refresh of cached schema information |
| `import_graph_data` | Data Import | Load data from GraphSON, CSV, or JSON with batch processing |
| `export_subgraph` | Data Export | Extract subgraphs to JSON, GraphSON, or CSV formats |
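For reference, MCP clients invoke these tools over JSON-RPC. A hypothetical `tools/call` request for `run_gremlin_query` might look like this (the `query` argument name is an assumption for illustration, not confirmed by this README):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_gremlin_query",
    "arguments": { "query": "g.V().hasLabel('person').limit(5).valueMap()" }
  }
}
```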
# The npx command will automatically install the package if needed
# No separate installation step required
# Clone and setup
git clone https://github.com/kpritam/gremlin-mcp.git
cd gremlin-mcp
npm install
npm run build
Add this to your MCP client configuration:
Using the published package (recommended):
{
"mcpServers": {
"gremlin": {
"command": "npx",
"args": ["@kpritam/gremlin-mcp"],
"env": {
"GREMLIN_ENDPOINT": "localhost:8182",
"LOG_LEVEL": "info"
}
}
}
}
From source:
{
"mcpServers": {
"gremlin": {
"command": "node",
"args": ["/path/to/gremlin-mcp/dist/server.js"],
"env": {
"GREMLIN_ENDPOINT": "localhost:8182",
"LOG_LEVEL": "info"
}
}
}
}
{
"mcpServers": {
"gremlin": {
"command": "npx",
"args": ["@kpritam/gremlin-mcp"],
"env": {
"GREMLIN_ENDPOINT": "your-server.com:8182",
"GREMLIN_USERNAME": "your-username",
"GREMLIN_PASSWORD": "your-password",
"GREMLIN_USE_SSL": "true"
}
}
}
}
Make sure your Gremlin-compatible database is running:
# For Apache TinkerPop Gremlin Server
./bin/gremlin-server.sh start
# Or using Docker
docker run -p 8182:8182 tinkerpop/gremlin-server
Restart your AI client and try asking:
"Can you check if my graph database is connected and show me its schema?"
You ask: "What's the structure of my graph database?"
AI response: The AI calls `get_graph_schema` and tells you about your node types, edge types, and how they're connected.
You ask: "Show me all people over 30 and their relationships"
AI response: The AI executes `g.V().hasLabel('person').has('age', gt(30)).out().path()` and explains the results in natural language.
You ask: "Give me some statistics about my graph"
AI response: The AI runs multiple queries to count nodes, edges, and analyze the distribution, then presents a summary.
You ask: "Load this GraphSON data into my database"
AI response: The AI uses `import_graph_data` to process your data in batches and reports the import status.
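Batch processing here amounts to splitting the record list into fixed-size chunks before writing them to the database. A minimal sketch of that idea (the helper name and batch size are illustrative, not the server's actual API):

```typescript
// Split an array of records into fixed-size batches.
// Illustrative helper, not the server's actual implementation.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Example: 5 vertices imported with a batch size of 2 yields 3 batches.
const vertices = ["v1", "v2", "v3", "v4", "v5"];
const batches = chunk(vertices, 2);
console.log(batches.length); // 3
```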
One of the most powerful features of this MCP server is Automatic Enum Discovery: it analyzes your graph data to discover the valid values of each property and provides them as enums to AI agents.
Why this matters: AI agents work best when they know the exact valid values for properties. Instead of guessing and issuing invalid queries, they can use precise, real values from your data.
Without Enum Discovery:
AI: "I see this vertex has a 'status' property of type 'string'...
Let me try querying with status='active'"
Result: ❌ No results (actual values are 'CONFIRMED', 'PENDING', 'CANCELLED')
With Enum Discovery:
AI: "I can see the 'status' property has these exact values:
['CONFIRMED', 'PENDING', 'CANCELLED', 'WAITLISTED']
Let me query with status='CONFIRMED'"
Result: ✅ Perfect results using real data values
The server automatically scans your graph properties, counts the distinct values of each one, and records those values as an enum when the count stays within the configured threshold, skipping blacklisted properties such as IDs and timestamps.
Example Output:
{
"name": "bookingStatus",
"type": ["string"],
"cardinality": "single",
"enum": ["CONFIRMED", "PENDING", "CANCELLED", "WAITLISTED"],
"sample_values": ["CONFIRMED", "PENDING"]
}
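The heuristic can be approximated as: a property qualifies as an enum when its distinct values stay at or below the cardinality threshold and its name is not blacklisted. A hedged sketch of that logic (function and parameter names are hypothetical, not the server's internals):

```typescript
// Decide whether a property's observed values should be exposed as an enum.
// Hypothetical helper mirroring the documented config, not the server's code.
function isEnumCandidate(
  name: string,
  values: string[],
  threshold = 10, // GREMLIN_ENUM_CARDINALITY_THRESHOLD
  blacklist = ["id", "uuid", "timestamp", "createdAt", "updatedAt"] // GREMLIN_ENUM_PROPERTY_BLACKLIST
): string[] | null {
  if (blacklist.includes(name)) return null;
  const distinct = Array.from(new Set(values));
  return distinct.length <= threshold ? distinct.sort() : null;
}

// "status" with 3 distinct values qualifies; "id" is always excluded.
console.log(isEnumCandidate("status", ["PENDING", "CONFIRMED", "PENDING", "CANCELLED"]));
console.log(isEnumCandidate("id", ["a1", "b2"]));
```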
Fine-tune enum discovery to match your data:
# Enable/disable enum discovery
GREMLIN_ENUM_DISCOVERY_ENABLED="true" # Default: true
# Control what gets detected as enum
GREMLIN_ENUM_CARDINALITY_THRESHOLD="10" # Max distinct values for enum (default: 10)
# Exclude specific properties
GREMLIN_ENUM_PROPERTY_BLACKLIST="id,uuid,timestamp,createdAt,updatedAt"
# Schema optimization
GREMLIN_SCHEMA_MAX_ENUM_VALUES="10" # Limit enum values shown (default: 10)
GREMLIN_SCHEMA_INCLUDE_SAMPLE_VALUES="false" # Reduce schema size (default: false)
Some properties should never be treated as enums:
Automatically Excluded:
Manual Exclusion:
# Exclude specific properties by name
GREMLIN_ENUM_PROPERTY_BLACKLIST="userId,sessionId,description,notes,content"
Common Blacklist Patterns:
- `id`, `uuid`, `guid` - Unique identifiers
- `timestamp`, `createdAt`, `updatedAt`, `lastModified` - Time fields
- `description`, `notes`, `comment`, `content`, `text` - Free-text fields
- `email`, `url`, `phone`, `address` - Personal/contact data
- `hash`, `token`, `key`, `secret` - Security-related fields

E-commerce Graph:
{
"orderStatus": {
"enum": ["PENDING", "PROCESSING", "SHIPPED", "DELIVERED", "CANCELLED"]
},
"productCategory": {
"enum": ["ELECTRONICS", "CLOTHING", "BOOKS", "HOME", "SPORTS"]
},
"paymentMethod": {
"enum": ["CREDIT_CARD", "PAYPAL", "BANK_TRANSFER", "CRYPTO"]
}
}
Social Network Graph:
{
"relationshipType": {
"enum": ["FRIEND", "FAMILY", "COLLEAGUE", "ACQUAINTANCE"]
},
"privacyLevel": {
"enum": ["PUBLIC", "FRIENDS", "PRIVATE"]
},
"accountStatus": {
"enum": ["ACTIVE", "SUSPENDED", "DEACTIVATED"]
}
}
For Large Datasets:
GREMLIN_ENUM_CARDINALITY_THRESHOLD="5" # Stricter enum detection
GREMLIN_SCHEMA_MAX_ENUM_VALUES="5" # Fewer values in schema
For Rich Categorical Data:
GREMLIN_ENUM_CARDINALITY_THRESHOLD="25" # More permissive detection
GREMLIN_SCHEMA_MAX_ENUM_VALUES="20" # Show more enum values
For Performance-Critical Environments:
GREMLIN_ENUM_DISCOVERY_ENABLED="false" # Disable for faster schema loading
GREMLIN_SCHEMA_INCLUDE_SAMPLE_VALUES="false" # Minimal schema size
This intelligent enum discovery makes AI-generated queries more accurate and the resulting insights more meaningful.
Works with any Gremlin-compatible graph database:
| Database | Status | Notes |
|---|---|---|
| Apache TinkerPop | ✅ Tested | Local development and CI testing |
| Amazon Neptune | 🚧 Compatible | Designed for, not yet tested |
| JanusGraph | 🚧 Compatible | Designed for, not yet tested |
| Azure Cosmos DB | 🚧 Compatible | With Gremlin API |
| ArcadeDB | 🚧 Compatible | With Gremlin support |
# Required
GREMLIN_ENDPOINT="localhost:8182"
# Optional
GREMLIN_USE_SSL="true" # Enable SSL/TLS
GREMLIN_USERNAME="username" # Authentication
GREMLIN_PASSWORD="password" # Authentication
GREMLIN_IDLE_TIMEOUT="300" # Connection timeout (seconds)
LOG_LEVEL="info" # Logging level
# Schema and performance tuning (see Automatic Enum Discovery section for details)
GREMLIN_ENUM_DISCOVERY_ENABLED="true" # Enable smart enum detection
GREMLIN_ENUM_CARDINALITY_THRESHOLD="10" # Max distinct values for enum
GREMLIN_ENUM_PROPERTY_BLACKLIST="id,timestamp" # Exclude specific properties
GREMLIN_SCHEMA_INCLUDE_SAMPLE_VALUES="false" # Reduce schema size
GREMLIN_SCHEMA_MAX_ENUM_VALUES="10" # Limit enum values shown
⚠️ Important: This server is designed for development and trusted environments.
| Problem | Solution |
|---|---|
| "Connection refused" | Verify the Gremlin server is running: `curl http://localhost:8182/` |
| "Authentication failed" | Check `GREMLIN_USERNAME` and `GREMLIN_PASSWORD` |
| "Invalid endpoint" | Use the format `host:port`, or `host:port/g` for the traversal source |
| Connection timeouts | Increase `GREMLIN_IDLE_TIMEOUT` |
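The accepted endpoint format (`host:port`, optionally followed by a traversal source such as `/g`) can be validated with a small parser like this (an illustrative sketch under those assumptions; the server's actual parsing may differ):

```typescript
// Parse "host:port" or "host:port/g" into its parts.
// Illustrative sketch only, not the server's real validation.
function parseEndpoint(endpoint: string): { host: string; port: number; traversalSource: string } {
  const match = /^([^:/]+):(\d+)(?:\/(\w+))?$/.exec(endpoint);
  if (!match) throw new Error(`Invalid endpoint: ${endpoint}`);
  // Default the traversal source to "g" when none is given.
  return { host: match[1], port: Number(match[2]), traversalSource: match[3] ?? "g" };
}

console.log(parseEndpoint("localhost:8182"));        // host "localhost", port 8182, source "g"
console.log(parseEndpoint("your-server.com:8182/g"));
```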
# Test connection
curl -f http://localhost:8182/
# Check server logs
tail -f logs/gremlin-mcp.log
# Verify the Gremlin endpoint responds
curl http://localhost:8182/gremlin
The following sections are for developers who want to contribute to or modify the server.
# Clone and install
git clone https://github.com/kpritam/gremlin-mcp.git
cd gremlin-mcp
npm install
# Development with hot reload
npm run dev
# Run tests
npm test
npm run test:coverage
npm run test:watch
# Integration tests (requires running Gremlin server)
GREMLIN_ENDPOINT=localhost:8182/g npm run test:it
# All tests together (unit + integration)
npm test && npm run test:it
src/
├── server.ts          # Main MCP server
├── config.ts          # Environment configuration
├── gremlin/
│   ├── client.ts      # Gremlin database client
│   └── models.ts      # TypeScript types and schemas
├── handlers/
│   ├── tools.ts       # MCP tool implementations
│   └── resources.ts   # MCP resource handlers
└── utils/             # Utility functions
| Command | Purpose |
|---|---|
| `npm run build` | Compile TypeScript to JavaScript |
| `npm run dev` | Development mode with hot reload |
| `npm test` | Run the unit test suite |
| `npm run lint` | Lint code with ESLint |
| `npm run format` | Format code with Prettier |
| `npm run validate` | Run all checks (format, lint, type-check, test) |
The server implements intelligent schema discovery with enumeration detection:
// Property with detected enum values
{
"name": "status",
"type": ["string"],
"cardinality": "single",
"enum": ["Confirmed", "Pending", "Cancelled", "Waitlisted"]
}
- See RULES.md for contribution guidelines
- Run `npm run validate` before committing
- Unit tests (`tests/`): individual component testing
- Integration tests (`tests/integration/`): full workflow testing
MIT License - feel free to use in your projects!
Questions? Check the troubleshooting guide or open an issue.