MCP Jenkins
Enables secure, contextual AI interactions with Jenkins tools via the Model Context Protocol.
MCP Jenkins is an open-source implementation of the Model Context Protocol (MCP) that bridges Jenkins with AI language models, following Anthropic's MCP specification. It enables secure, contextual AI interactions with Jenkins tools while maintaining data privacy and security.
Cursor Demo
Setup Guide
Installation
Choose one of these installation methods:
# Using uv (recommended)
pip install uv
uvx mcp-jenkins
# Using pip
pip install mcp-jenkins
# Using Smithery
npx -y @smithery/cli@latest install @lanbaoshen/mcp-jenkins --client claude
Docker Installation
Pull the latest image from GitHub Container Registry:
docker pull ghcr.io/lanbaoshen/mcp-jenkins:latest
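The image can stand in for the uvx command anywhere a client spawns the server over stdio. A minimal sketch, assuming the image's entrypoint is the mcp-jenkins CLI, using the same StdioServerParams class as the AutoGen example further below:
from autogen_ext.tools.mcp import StdioServerParams

# Hypothetical stdio server params that launch the container instead of uvx;
# replace the xxx placeholders with your Jenkins URL and credentials.
docker_server_params = StdioServerParams(
    command='docker',
    args=[
        'run', '-i', '--rm',
        'ghcr.io/lanbaoshen/mcp-jenkins:latest',
        '--jenkins-url=xxx',
        '--jenkins-username=xxx',
        '--jenkins-password=xxx',
    ],
)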
Configuration and Usage
Cursor
- Open Cursor Settings
- Navigate to MCP
- Click + Add new global MCP server
This will create or edit the ~/.cursor/mcp.json file with your MCP server configuration.
{
  "mcpServers": {
    "mcp-jenkins": {
      "command": "uvx",
      "args": [
        "mcp-jenkins",
        "--jenkins-url=xxx",
        "--jenkins-username=xxx",
        "--jenkins-password=xxx"
      ]
    }
  }
}
Note: You can also use a Jenkins API token as the password.
VSCode Copilot Chat
- Create a .vscode folder with an mcp.json file in your workspace for a local setup, or edit settings.json through the settings menu.
- Insert the following configuration:
- SSE mode
{
  "servers": {
    "jenkins": {
      "url": "http://localhost:9887/sse",
      "type": "sse"
    }
  }
}
- Streamable-Http mode
{
  "servers": {
    "mcp-jenkins-mcp": {
      "autoApprove": [],
      "disabled": false,
      "timeout": 60,
      "type": "streamableHttp",
      "url": "http://localhost:9887/mcp"
    }
  }
}
- Run the Jenkins MCP server with the following command:
uvx mcp-jenkins \
--jenkins-url https://jenkins.example.com \
--jenkins-username your_username \
--jenkins-password your_password_or_token \
--transport sse --port 9887
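Before wiring the endpoint into VSCode, you can check that the SSE server is reachable. A minimal sketch using the official mcp Python SDK (pip install mcp); the URL assumes the --transport sse --port 9887 command above:
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def check() -> None:
    # Connect to the SSE endpoint and list the tools the server exposes.
    async with sse_client('http://localhost:9887/sse') as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(check())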
Command-line arguments
# Stdio Mode
uvx mcp-jenkins --jenkins-url xxx --jenkins-username xxx --jenkins-password xxx --read-only
# SSE Mode
uvx mcp-jenkins --jenkins-url xxx --jenkins-username xxx --jenkins-password xxx --transport sse --port 9887
# Streamable-Http Mode
uvx mcp-jenkins --jenkins-url xxx --jenkins-username xxx --jenkins-password xxx --transport streamable-http --port 9887
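Any MCP-capable client can drive these modes. As a sketch, the official mcp Python SDK can spawn the stdio mode shown above and call one of the tools listed later; the Jenkins URL and credentials below are placeholders:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Spawn mcp-jenkins in stdio mode, mirroring the "Stdio Mode" command above.
    server = StdioServerParameters(
        command='uvx',
        args=[
            'mcp-jenkins',
            '--jenkins-url', 'https://jenkins.example.com',
            '--jenkins-username', 'your_username',
            '--jenkins-password', 'your_password_or_token',
            '--read-only',
        ],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call one of the tools from the "Available Tools" section below.
            result = await session.call_tool('get_all_jobs', {})
            print(result.content)


asyncio.run(main())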
AutoGen
Install autogen:
pip install "autogen-ext[azure,ollama,openai,mcp]" autogen-chat
Run the Python script:
import asyncio

from autogen_ext.tools.mcp import StdioMcpToolAdapter, StdioServerParams
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_core import CancellationToken


async def main() -> None:
    # Create server params for the mcp-jenkins stdio server
    server_params = StdioServerParams(
        command='uvx',
        args=[
            'mcp-jenkins',
            '--jenkins-username',
            'xxx',
            '--jenkins-password',
            'xxx',
            '--jenkins-url',
            'xxx'
        ],
    )

    # Get the get_all_jobs tool from the server
    adapter = await StdioMcpToolAdapter.from_server_params(server_params, 'get_all_jobs')

    # Create an agent that can use the tool
    agent = AssistantAgent(
        name='jenkins_assistant',
        model_client=[Replace_with_your_model_client],
        tools=[adapter],
    )

    # Let the agent list the Jenkins jobs
    await Console(
        agent.run_stream(task='Get all jobs', cancellation_token=CancellationToken())
    )


if __name__ == "__main__":
    asyncio.run(main())
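The model_client placeholder above needs a real AutoGen model client. One option, assuming the openai extra installed earlier and an OPENAI_API_KEY in your environment, is sketched below; pass the resulting object directly as model_client:
from autogen_ext.models.openai import OpenAIChatCompletionClient

# Illustrative model client for the placeholder above; the model name is an example.
model_client = OpenAIChatCompletionClient(model='gpt-4o')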
Available Tools
| Tool | Description |
|---|---|
| get_all_jobs | Get all jobs |
| get_job_config | Get job config |
| search_jobs | Search job by specific field |
| get_running_builds | Get running builds |
| stop_build | Stop running build |
| get_build_info | Get build info |
| get_build_sourcecode | Get the pipeline source code of a specific build in Jenkins |
| get_job_info | Get job info |
| build_job | Build a job with param |
| get_build_logs | Get build logs |
| get_test_results | Get test results for a specific build |
| get_all_nodes | Get nodes |
| get_node_config | Get the config of node |
| get_all_queue_items | Get all queue items |
| get_queue_item | Get queue item info |
| cancel_queue_item | Cancel queue item |
| get_multibranch_jobs | Get all multibranch pipeline jobs from Jenkins, optionally filtered by patterns |
| get_multibranch_branches | Get all branches for a specific multibranch pipeline job |
| scan_multibranch_pipeline | Trigger a scan of a multibranch pipeline to discover new branches |
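Tool argument names (for example, the parameters build_job accepts) are defined by the server's tool schemas rather than listed here. With an initialized MCP client session, such as the ones sketched above, you can read each tool's JSON input schema instead of guessing; a minimal helper, assuming the official mcp Python SDK:
from mcp import ClientSession


async def show_tool_schemas(session: ClientSession) -> None:
    # Print each tool's name and JSON input schema so argument names
    # (e.g. for build_job) can be read directly from the server.
    tools = await session.list_tools()
    for tool in tools.tools:
        print(tool.name, tool.inputSchema)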
Development & Debugging
# Using MCP Inspector
# For installed package
npx @modelcontextprotocol/inspector uvx mcp-jenkins --jenkins-url xxx --jenkins-username xxx --jenkins-password xxx
# For local development version
npx @modelcontextprotocol/inspector uv --directory /path/to/your/mcp-jenkins run mcp-jenkins --jenkins-url xxx --jenkins-username xxx --jenkins-password xxx
Pre-Commit Hook
# Install dependencies
uv sync --all-extras --dev
pre-commit install
# Manually execute
pre-commit run --all-files
Unit Tests
# Install dependencies
uv sync --all-extras --dev
# Run unit tests
uv run pytest --cov=mcp_jenkins
License
Licensed under MIT - see LICENSE file. This is not an official Jenkins product.
MCP-Jenkins in MCP Registries
- https://mcpreview.com/mcp-servers/lanbaoshen/mcp-jenkins
- https://smithery.ai/server/@lanbaoshen/mcp-jenkins
- https://glama.ai/mcp/servers/@lanbaoshen/mcp-jenkins
- https://mseep.ai/app/lanbaoshen-mcp-jenkins
Star History