GitHub Actions MCP Server
⚠️ Archive Notice: This repository will be archived soon as the official GitHub MCP server is adding Actions support. See github/github-mcp-server#491 for details on the official implementation.
MCP Server for the GitHub Actions API, enabling AI assistants to manage and operate GitHub Actions workflows. Compatible with multiple AI coding assistants including Claude Desktop, Codeium, and Windsurf.
Features
- Complete Workflow Management: List, view, trigger, cancel, and rerun workflows
- Workflow Run Analysis: Get detailed information about workflow runs and their jobs
- Comprehensive Error Handling: Clear error messages with enhanced details
- Flexible Type Validation: Robust type checking with graceful handling of API variations
- Security-Focused Design: Timeout handling, rate limiting, and strict URL validation
Tools
- `list_workflows` - List workflows in a GitHub repository
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `page` (optional number): Page number for pagination
    - `perPage` (optional number): Results per page (max 100)
  - Returns: List of workflows in the repository
- `get_workflow` - Get details of a specific workflow
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `workflowId` (string or number): The ID of the workflow or its filename
  - Returns: Detailed information about the workflow
- `get_workflow_usage` - Get usage statistics of a workflow
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `workflowId` (string or number): The ID of the workflow or its filename
  - Returns: Usage statistics including billable minutes
- `list_workflow_runs` - List all workflow runs for a repository or a specific workflow
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `workflowId` (optional string or number): The ID of the workflow or its filename
    - `actor` (optional string): Filter by the user who triggered the workflow
    - `branch` (optional string): Filter by branch
    - `event` (optional string): Filter by event type
    - `status` (optional string): Filter by status
    - `created` (optional string): Filter by creation date (YYYY-MM-DD)
    - `excludePullRequests` (optional boolean): Exclude PR-triggered runs
    - `checkSuiteId` (optional number): Filter by check suite ID
    - `page` (optional number): Page number for pagination
    - `perPage` (optional number): Results per page (max 100)
  - Returns: List of workflow runs matching the criteria
- `get_workflow_run` - Get details of a specific workflow run
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `runId` (number): The ID of the workflow run
  - Returns: Detailed information about the specific workflow run
- `get_workflow_run_jobs` - Get jobs for a specific workflow run
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `runId` (number): The ID of the workflow run
    - `filter` (optional string): Filter jobs by completion status ('latest', 'all')
    - `page` (optional number): Page number for pagination
    - `perPage` (optional number): Results per page (max 100)
  - Returns: List of jobs in the workflow run
- `trigger_workflow` - Trigger a workflow run
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `workflowId` (string or number): The ID of the workflow or its filename
    - `ref` (string): The reference to run the workflow on (branch, tag, or SHA)
    - `inputs` (optional object): Input parameters for the workflow
  - Returns: Information about the triggered workflow run
- `cancel_workflow_run` - Cancel a workflow run
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `runId` (number): The ID of the workflow run
  - Returns: Status of the cancellation operation
- `rerun_workflow` - Re-run a workflow run
  - Inputs:
    - `owner` (string): Repository owner (username or organization)
    - `repo` (string): Repository name
    - `runId` (number): The ID of the workflow run
  - Returns: Status of the re-run operation
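On the wire, an MCP client invokes any of these tools with a JSON-RPC 2.0 `tools/call` request. The shape below is a sketch based on the MCP specification, with placeholder argument values:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_workflows",
    "arguments": {
      "owner": "your-username",
      "repo": "your-repository"
    }
  }
}
```

AI assistants construct these requests for you; the tool and parameter names above are what they match against.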
Usage with AI Coding Assistants
This MCP server is compatible with multiple AI coding assistants including Claude Desktop, Codeium, and Windsurf.
Claude Desktop
First, make sure you have built the project (see Build section below). Then, add the following to your claude_desktop_config.json:
    {
      "mcpServers": {
        "github-actions": {
          "command": "node",
          "args": [
            "<path-to-mcp-server>/dist/index.js"
          ],
          "env": {
            "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
          }
        }
      }
    }
Codeium
Add the following configuration to your Codeium MCP config file (typically at ~/.codeium/windsurf/mcp_config.json on Unix-based systems or %USERPROFILE%\.codeium\windsurf\mcp_config.json on Windows):
    {
      "mcpServers": {
        "github-actions": {
          "command": "node",
          "args": [
            "<path-to-mcp-server>/dist/index.js"
          ],
          "env": {
            "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
          }
        }
      }
    }
Windsurf
Windsurf uses the same configuration format as Codeium. Add the server to your Windsurf MCP configuration as shown above for Codeium.
Build
Unix/Linux/macOS
Clone the repository and build:
    git clone https://github.com/ko1ynnky/github-actions-mcp-server.git
    cd github-actions-mcp-server
    npm install
    npm run build
Windows
For Windows systems, use the Windows-specific build command:
    git clone https://github.com/ko1ynnky/github-actions-mcp-server.git
    cd github-actions-mcp-server
    npm install
    npm run build:win
Alternatively, you can use the included batch file:
    run-server.bat [optional-github-token]
This will create the necessary files in the dist directory that you'll need to run the MCP server.
Windows-Specific Instructions
Prerequisites
- Node.js (v14 or higher)
- npm (v6 or higher)
Running the Server on Windows
- Using the batch file (simplest method):

      run-server.bat [optional-github-token]

  This checks whether a build exists, builds if needed, and starts the server.

- Using npm directly:

      npm run start
Setting GitHub Personal Access Token on Windows
For full functionality and to avoid rate limiting, you need to set your GitHub Personal Access Token.
Options:
- Pass it as a parameter to the batch file:

      run-server.bat your_github_token_here

- Set it as an environment variable, then start the server:

      set GITHUB_PERSONAL_ACCESS_TOKEN=your_github_token_here
      npm run start
Troubleshooting Windows Issues
If you encounter issues:
- Build errors: Make sure TypeScript is installed correctly.

      npm install -g typescript

- Permission issues: Ensure you're running the commands in a command prompt with appropriate permissions.

- Node.js errors: Verify you're using a compatible Node.js version.

      node --version
Usage Examples
List workflows in a repository:
    const result = await listWorkflows({
      owner: "your-username",
      repo: "your-repository"
    });
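The optional parameters of `list_workflow_runs` mirror the query parameters of GitHub's `GET /repos/{owner}/{repo}/actions/runs` endpoint. As a rough illustration of that mapping (`buildRunsQuery` is a hypothetical helper, not part of this server's API):

```javascript
// Hypothetical helper: translate list_workflow_runs-style filters into the
// query string GitHub's "list workflow runs" REST endpoint expects.
function buildRunsQuery({ actor, branch, event, status, created, page, perPage } = {}) {
  const params = new URLSearchParams();
  if (actor) params.set("actor", actor);
  if (branch) params.set("branch", branch);
  if (event) params.set("event", event);
  if (status) params.set("status", status);
  if (created) params.set("created", created);
  if (page) params.set("page", String(page));
  if (perPage) params.set("per_page", String(Math.min(perPage, 100)));
  return params.toString();
}

console.log(buildRunsQuery({ branch: "main", status: "failure", perPage: 10 }));
// branch=main&status=failure&per_page=10
```

Note the camelCase tool parameter `perPage` corresponds to GitHub's snake_case `per_page`, and values above 100 are capped as documented.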
Trigger a workflow:
    const result = await triggerWorkflow({
      owner: "your-username",
      repo: "your-repository",
      workflowId: "ci.yml",
      ref: "main",
      inputs: {
        environment: "production"
      }
    });
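Note that the `inputs` passed to `trigger_workflow` must match `workflow_dispatch` inputs declared in the target workflow file. A hypothetical `ci.yml` compatible with the example above might declare:

```yaml
on:
  workflow_dispatch:
    inputs:
      environment:
        description: "Deployment target"
        required: true
        default: "staging"
```

If the workflow declares no `workflow_dispatch` trigger, or the input names don't match, GitHub rejects the dispatch request.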
Troubleshooting
Common Issues
- Authentication Errors:
  - Ensure your GitHub token has the correct permissions
  - Check that the token is correctly set as an environment variable
- Rate Limiting:
  - The server implements rate limiting to avoid hitting GitHub API limits
  - If you encounter rate limit errors, reduce the frequency of requests
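The README doesn't document the exact rate-limiting strategy, but a common client-side complement is exponential backoff between retries. A hypothetical sketch of the delay schedule such an approach might use (not the server's actual implementation):

```javascript
// Illustrative exponential backoff schedule: 1s, 2s, 4s, ... capped at 60s.
// Retrying on this schedule keeps a client under GitHub's API rate limits.
function backoffDelays(retries, baseMs = 1000) {
  return Array.from({ length: retries }, (_, i) => Math.min(baseMs * 2 ** i, 60000));
}

console.log(backoffDelays(5)); // [ 1000, 2000, 4000, 8000, 16000 ]
```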
- Type Validation Errors:
  - GitHub API responses might sometimes differ from expected schemas
  - The server implements flexible validation to handle most variations
  - If you encounter persistent errors, please open an issue
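As one illustration of what "flexible validation" can mean in practice (a hypothetical sketch, not this server's actual code): `workflowId` is accepted as either a numeric ID or a filename, so a tolerant implementation normalizes both forms up front rather than rejecting one of them:

```javascript
// Hypothetical normalizer: accept a workflow ID (number or numeric string)
// or a workflow filename, and reject anything else with a clear error.
function normalizeWorkflowId(workflowId) {
  if (typeof workflowId === "number" && Number.isInteger(workflowId)) {
    return workflowId;
  }
  if (typeof workflowId === "string" && workflowId.trim() !== "") {
    // Numeric strings become IDs; anything else is treated as a filename.
    return /^\d+$/.test(workflowId) ? Number(workflowId) : workflowId;
  }
  throw new TypeError("workflowId must be a workflow ID or a filename");
}

console.log(normalizeWorkflowId("161335")); // 161335
console.log(normalizeWorkflowId("ci.yml")); // ci.yml
```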
License
This MCP server is licensed under the MIT License.
