Jira MCP Server
A Model Context Protocol (MCP) server that provides access to JIRA issue data stored in Snowflake. This server enables AI assistants to query, filter, and analyze JIRA issues through a standardized interface.
Overview
This MCP server connects to Snowflake to query JIRA data and provides four main tools for interacting with the data:
- list_jira_issues - Query and filter JIRA issues with various criteria
- get_jira_issue_details - Get detailed information for a specific issue by key
- get_jira_project_summary - Get statistics and summaries for all projects
- list_jira_components - List and filter JIRA components with various criteria
Features
Data Sources
The server connects to Snowflake and queries the following tables:
- JIRA_ISSUE_NON_PII - Main issue data (non-personally identifiable information)
- JIRA_LABEL_RHAI - Issue labels and tags
- JIRA_COMMENT_NON_PII - Issue comments (non-personally identifiable information)
- JIRA_COMPONENT_RHAI - JIRA project components and their metadata
Note: These tables must exist in your configured Snowflake database and schema.
Available Tools
1. List Issues (list_jira_issues)
Query JIRA issues with optional filtering:
- Project filtering - Filter by project key (e.g., 'SMQE', 'OSIM')
- Issue type filtering - Filter by issue type ID
- Status filtering - Filter by issue status ID
- Priority filtering - Filter by priority ID
- Text search - Search in summary and description fields
- Result limiting - Control number of results returned (default: 50)
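For example, several filters can be combined in a single call (a sketch; the project, search_text, and limit parameter names match the Usage Examples later in this README, while other filter parameter names may differ in src/tools.py):
# List up to 25 OSIM issues that mention "kafka" in summary or description
result = await list_jira_issues(project="OSIM", search_text="kafka", limit=25)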
2. Get Issue Details (get_jira_issue_details)
Retrieve comprehensive information for a specific JIRA issue by its key (e.g., 'SMQE-1280'), including:
- Basic issue information (summary, description, status, priority)
- Timestamps (created, updated, due date, resolution date)
- Time tracking (original estimate, current estimate, time spent)
- Metadata (votes, watches, environment, components)
- Associated labels
- Comments (with comment body, creation/update timestamps, and role level)
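For instance (a sketch: the call signature matches the Usage Examples below, but the response field names used here are assumptions for illustration, not a documented schema):
result = await get_jira_issue_details(issue_key="SMQE-1280")
# Hypothetical field access -- adjust to the actual response structure
for comment in result.get("comments", []):
    print(comment.get("created"), comment.get("body"))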
3. Get Project Summary (get_jira_project_summary)
Generate statistics across all projects:
- Total issue counts per project
- Status distribution per project
- Priority distribution per project
- Overall statistics
4. List Components (list_jira_components)
Query and filter JIRA components with optional criteria:
- Project filtering - Filter by project ID (e.g., '12325621')
- Archived filtering - Filter by archived status ('Y' or 'N')
- Deleted filtering - Filter by deleted status ('Y' or 'N')
- Text search - Search in component name and description fields
- Result limiting - Control number of results returned (default: 50)
Returns component information including:
- Component ID, project, name, and description
- Component URL and assignee details
- Archived and deleted status
- Lead information and assignee type
- Last sync timestamp
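A call might look like the following (a sketch; the parameter names mirror the filter list above and the list_jira_issues signature, and are assumptions until checked against src/tools.py):
# Hypothetical parameter names based on the filters documented above
result = await list_jira_components(
    project="12325621",    # project ID, not project key
    archived="N",
    search_text="installer",
    limit=20,
)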
Monitoring & Metrics
The server includes optional Prometheus metrics support for monitoring:
- Tool usage tracking - Track calls to each MCP tool with success/error rates and duration
- Snowflake query monitoring - Monitor database query performance and success rates
- Connection tracking - Track active MCP connections
- HTTP endpoints - /metrics for Prometheus scraping and /health for health checks
Prerequisites
- Python 3.10+
- UV (Python package manager)
- Podman or Docker
- Access to Snowflake with appropriate credentials
Architecture
The codebase is organized into modular components in the src/ directory:
- src/mcp_server.py - Main server entry point and MCP initialization
- src/config.py - Configuration management and environment variable handling
- src/database.py - Snowflake database connection and query execution
- src/tools.py - MCP tool implementations and business logic
- src/metrics.py - Optional Prometheus metrics collection and HTTP server
Environment Variables
The following environment variables are used to configure the Snowflake connection:
Connection Method
- SNOWFLAKE_CONNECTION_METHOD - Connection method to use
  - Values: api (REST API) or connector (snowflake-connector-python)
  - Default: api
REST API Method (Default)
When using SNOWFLAKE_CONNECTION_METHOD=api:
Required
- SNOWFLAKE_TOKEN - Your Snowflake authentication token (Bearer token)
- SNOWFLAKE_BASE_URL - Snowflake API base URL (e.g., https://your-account.snowflakecomputing.com/api/v2)
- SNOWFLAKE_DATABASE - Snowflake database name containing your JIRA data
- SNOWFLAKE_SCHEMA - Snowflake schema name containing your JIRA tables
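With these variables set, the server issues requests against the Snowflake SQL API. As a rough illustration of what such a request looks like (a minimal sketch using httpx, not the actual code in src/database.py):
import os
import httpx

# Minimal sketch of a Snowflake SQL API v2 statement request
resp = httpx.post(
    f"{os.environ['SNOWFLAKE_BASE_URL']}/statements",
    headers={
        "Authorization": f"Bearer {os.environ['SNOWFLAKE_TOKEN']}",
        "Content-Type": "application/json",
    },
    json={
        "statement": "SELECT COUNT(*) FROM JIRA_ISSUE_NON_PII",
        "database": os.environ["SNOWFLAKE_DATABASE"],
        "schema": os.environ["SNOWFLAKE_SCHEMA"],
    },
)
resp.raise_for_status()
print(resp.json()["data"])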
Connector Method (Service Account Support)
When using SNOWFLAKE_CONNECTION_METHOD=connector:
Required for All Methods
- SNOWFLAKE_ACCOUNT - Snowflake account identifier (e.g., your-account.snowflakecomputing.com)
- SNOWFLAKE_DATABASE - Snowflake database name containing your JIRA data
- SNOWFLAKE_SCHEMA - Snowflake schema name containing your JIRA tables
- SNOWFLAKE_WAREHOUSE - Snowflake warehouse name
Authentication Methods
Private Key Authentication (Recommended for Service Accounts)
- SNOWFLAKE_AUTHENTICATOR - Set to snowflake_jwt
- SNOWFLAKE_USER - Snowflake username that has the public key registered
- SNOWFLAKE_PRIVATE_KEY_FILE - Path to private key file (PKCS#8 format)
- SNOWFLAKE_PRIVATE_KEY_FILE_PWD - Private key password (optional, if the key is encrypted)
Username/Password Authentication
- SNOWFLAKE_AUTHENTICATOR - Set to snowflake (default)
- SNOWFLAKE_USER - Snowflake username
- SNOWFLAKE_PASSWORD - Snowflake password
OAuth Client Credentials
- SNOWFLAKE_AUTHENTICATOR - Set to oauth_client_credentials
- SNOWFLAKE_OAUTH_CLIENT_ID - OAuth client ID
- SNOWFLAKE_OAUTH_CLIENT_SECRET - OAuth client secret
- SNOWFLAKE_OAUTH_TOKEN_URL - OAuth token URL (optional)
OAuth Token
- SNOWFLAKE_AUTHENTICATOR - Set to oauth
- SNOWFLAKE_TOKEN - OAuth access token
Optional
- SNOWFLAKE_ROLE - Snowflake role to use (optional)
General Configuration
- MCP_TRANSPORT - Transport protocol for MCP communication
  - Default: stdio
- ENABLE_METRICS - Enable Prometheus metrics collection
  - Default: false
- METRICS_PORT - Port for the metrics HTTP server
  - Default: 8000
Private Key Setup Example
To set up private key authentication:
- Generate an RSA key pair:
  # Generate private key
  openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8
  # Generate public key
  openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
- Register the public key with your Snowflake user:
  ALTER USER your_service_account SET RSA_PUBLIC_KEY='MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA...';
- Set environment variables:
  export SNOWFLAKE_CONNECTION_METHOD=connector
  export SNOWFLAKE_AUTHENTICATOR=snowflake_jwt
  export SNOWFLAKE_ACCOUNT=your-account.snowflakecomputing.com
  export SNOWFLAKE_USER=your_service_account
  export SNOWFLAKE_PRIVATE_KEY_FILE=/path/to/rsa_key.p8
  export SNOWFLAKE_DATABASE=your_database
  export SNOWFLAKE_SCHEMA=your_schema
  export SNOWFLAKE_WAREHOUSE=your_warehouse
  export SNOWFLAKE_ROLE=your_role
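Optionally, you can sanity-check that the key file parses as PKCS#8 before starting the server (a quick check using the cryptography package, which the Snowflake connector depends on):
from cryptography.hazmat.primitives import serialization

# Raises ValueError if the file is not a valid PEM/PKCS#8 private key;
# pass password=b"..." instead of None if the key is encrypted
with open("/path/to/rsa_key.p8", "rb") as f:
    serialization.load_pem_private_key(f.read(), password=None)
print("private key OK")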
Installation & Setup
Migration from pip to UV
This project has been updated to use UV for dependency management. If you have an existing setup:
- Remove your old virtual environment:
  rm -rf venv/
- Install UV if you haven't already (see the Local Development section below)
- Install dependencies with UV:
  uv sync
Local Development
- Clone the repository:
git clone <repository-url>
cd jira-mcp-snowflake
- Install UV if you haven't already:
# On macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
# Or via pip
pip install uv
- Install dependencies:
uv sync
- Set up environment variables (see the Environment Variables section above)
- Run the server:
  uv run python src/mcp_server.py
Using Makefile Targets
For convenience, several Makefile targets are available to streamline development tasks:
Development Setup
# Install dependencies including dev packages
make uv_sync_dev
Testing and Quality Assurance
# Run linting (flake8)
make lint
# Run tests with coverage
make pytest
# Run both linting and tests
make test
Building
# Build container image with Podman
make build
Note: On macOS, you may need to install a newer version of make via Homebrew:
brew install make
Container Deployment
Building locally
To build the container image locally using Podman, run:
podman build -t localhost/jira-mcp-snowflake:latest .
This will create a local image named jira-mcp-snowflake:latest that you can use to run the server. The container uses UV for fast dependency management.
Running with Podman or Docker
Example 1: REST API with Token
{
"mcpServers": {
"jira-mcp-snowflake": {
"command": "podman",
"args": [
"run",
"-i",
"--rm",
"-e", "SNOWFLAKE_CONNECTION_METHOD=api",
"-e", "SNOWFLAKE_TOKEN=your_token_here",
"-e", "SNOWFLAKE_BASE_URL=https://your-account.snowflakecomputing.com/api/v2",
"-e", "SNOWFLAKE_DATABASE=your_database_name",
"-e", "SNOWFLAKE_SCHEMA=your_schema_name",
"-e", "MCP_TRANSPORT=stdio",
"-e", "ENABLE_METRICS=true",
"-e", "METRICS_PORT=8000",
"localhost/jira-mcp-snowflake:latest"
]
}
}
}
Example 2: Private Key Authentication (Service Account)
{
"mcpServers": {
"jira-mcp-snowflake": {
"command": "podman",
"args": [
"run",
"-i",
"--rm",
"-v", "/path/to/your/rsa_key.p8:/app/rsa_key.p8:ro",
"-e", "SNOWFLAKE_CONNECTION_METHOD=connector",
"-e", "SNOWFLAKE_AUTHENTICATOR=snowflake_jwt",
"-e", "SNOWFLAKE_ACCOUNT=your-account.snowflakecomputing.com",
"-e", "SNOWFLAKE_USER=your_service_account",
"-e", "SNOWFLAKE_PRIVATE_KEY_FILE=/app/rsa_key.p8",
"-e", "SNOWFLAKE_DATABASE=your_database_name",
"-e", "SNOWFLAKE_SCHEMA=your_schema_name",
"-e", "SNOWFLAKE_WAREHOUSE=your_warehouse_name",
"-e", "SNOWFLAKE_ROLE=your_role_name",
"-e", "MCP_TRANSPORT=stdio",
"-e", "ENABLE_METRICS=true",
"-e", "METRICS_PORT=8000",
"localhost/jira-mcp-snowflake:latest"
]
}
}
}
Then access metrics at http://localhost:8000/metrics (to reach the endpoint from the host, publish the port by adding "-p", "8000:8000" to the container args).
Connecting to a remote instance
Example configuration for connecting to a remote instance:
{
"mcpServers": {
"jira-mcp-snowflake": {
"url": "https://jira-mcp-snowflake.example.com/sse",
"headers": {
"X-Snowflake-Token": "your_token_here"
}
}
}
}
VS Code Continue Integration
Example configuration to add to VS Code Continue:
{
"experimental": {
"modelContextProtocolServers": [
{
"name": "jira-mcp-snowflake",
"transport": {
"type": "stdio",
"command": "podman",
"args": [
"run",
"-i",
"--rm",
"-e", "SNOWFLAKE_TOKEN=your_token_here",
"-e", "SNOWFLAKE_BASE_URL=https://your-account.snowflakecomputing.com/api/v2",
"-e", "SNOWFLAKE_DATABASE=your_database_name",
"-e", "SNOWFLAKE_SCHEMA=your_schema_name",
"-e", "MCP_TRANSPORT=stdio",
"-e", "ENABLE_METRICS=true",
"-e", "METRICS_PORT=8000",
"localhost/jira-mcp-snowflake:latest"
]
}
}
]
}
}
Usage Examples
Query Issues by Project
# List all issues from the SMQE project
result = await list_jira_issues(project="SMQE", limit=10)
Search Issues by Text
# Search for issues containing "authentication" in summary or description
result = await list_jira_issues(search_text="authentication", limit=20)
Get Specific Issue Details
# Get detailed information for a specific issue
result = await get_jira_issue_details(issue_key="SMQE-1280")
Get Project Overview
# Get statistics for all projects
result = await get_jira_project_summary()
Monitoring
When metrics are enabled, the server provides the following monitoring endpoints:
- /metrics - Prometheus metrics endpoint for scraping
- /health - Health check endpoint returning JSON status
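With ENABLE_METRICS=true and the default METRICS_PORT of 8000, you can verify both endpoints, for example with httpx:
import httpx

# Health check returns a JSON status document
print(httpx.get("http://localhost:8000/health").json())

# Metrics endpoint returns Prometheus text exposition format
print(httpx.get("http://localhost:8000/metrics").text[:500])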
Available Metrics
- mcp_tool_calls_total - Counter of tool calls by tool name and status
- mcp_tool_call_duration_seconds - Histogram of tool call durations
- mcp_active_connections - Gauge of active MCP connections
- mcp_snowflake_queries_total - Counter of Snowflake queries by status
- mcp_snowflake_query_duration_seconds - Histogram of Snowflake query durations
Data Privacy
This server is designed to work with non-personally identifiable information (non-PII) data only. The Snowflake tables should contain sanitized data with any sensitive personal information removed.
Security Considerations
- Environment Variables: Store sensitive values such as SNOWFLAKE_TOKEN in environment variables, never in code
- Token Security: Keep your Snowflake token secure and rotate it regularly
- Network Security: Use HTTPS endpoints and secure network connections
- Access Control: Follow the principle of least privilege for Snowflake database access
- SQL Injection Prevention: The server sanitizes inputs to prevent SQL injection attacks (see the sketch below)
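As an illustration of the general idea behind that last point (a sketch, not the actual implementation in src/database.py), user-supplied filter values can be validated against a strict pattern before ever reaching a query:
import re

def sanitize_project_key(value: str) -> str:
    """Allow only characters a JIRA project key can contain."""
    if not re.fullmatch(r"[A-Z][A-Z0-9]{1,9}", value):
        raise ValueError(f"invalid project key: {value!r}")
    return value

# 'SMQE' passes; a payload like "X'; DROP TABLE --" raises ValueError
query = f"SELECT * FROM JIRA_ISSUE_NON_PII WHERE PROJECT = '{sanitize_project_key('SMQE')}'"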
Dependencies
- httpx - HTTP client library for Snowflake API communication
- fastmcp - Fast MCP server framework
- prometheus_client - Prometheus metrics client (optional, for monitoring)
Development
Code Structure
The project follows a modular architecture:
jira-mcp-snowflake/
├── src/
│ ├── mcp_server.py # Main entry point
│ ├── config.py # Configuration and environment variables
│ ├── database.py # Snowflake database operations
│ ├── tools.py # MCP tool implementations
│ └── metrics.py # Prometheus metrics (optional)
├── requirements.txt # Python dependencies
└── README.md # This file
Adding New Tools
To add new MCP tools:
- Add the tool function to src/tools.py
- Decorate it with @mcp.tool() and @track_tool_usage("tool_name")
- Follow the existing patterns for error handling and logging
- Update this README with documentation for the new tool (a sketch follows this list)
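A new tool would follow the same shape as the existing ones; roughly (a sketch: the import paths and the example tool below are hypothetical, so check src/tools.py for the real patterns):
from mcp_server import mcp              # hypothetical import path
from metrics import track_tool_usage    # hypothetical import path

@mcp.tool()
@track_tool_usage("count_jira_labels")  # hypothetical example tool
async def count_jira_labels(project: str | None = None) -> dict:
    """Count label usage across issues, optionally filtered by project key."""
    # Query JIRA_LABEL_RHAI via the helpers in src/database.py, following
    # the existing error-handling and logging patterns
    ...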