MCP Kubernetes Server
Control Kubernetes clusters through interactions with Large Language Models (LLMs).
This is an MCP (Model Context Protocol) server for Kubernetes that provides control over Kubernetes clusters through interactions with LLMs.
Overview
This server lets you perform common Kubernetes operations through MCP tools. It wraps kubectl commands to provide a simple interface for managing Kubernetes resources. The Model Context Protocol (MCP) enables seamless interaction between language models and Kubernetes operations.
What is MCP?
Model Context Protocol (MCP) is a framework that enables Language Models to interact with external tools and services in a structured way. It provides:
- A standardized way to expose functionality to language models
- Context management for operations
- Tool discovery and documentation
- Type-safe interactions between models and tools
Usage Examples
- Create a new deployment for me with name nginx-app and image nginx:latest in the production namespace with 3 replicas.
- Update the deployment nginx-app to version 1.19 in the production namespace.
- Scale the deployment nginx-app to 5 replicas in the production namespace.
- Get me the pods in the production namespace.
- Get me all namespaces in the cluster.
- Get me all nodes in the cluster.
- Get me all services in the cluster.
- Get me all deployments in the cluster.
- Get me all jobs in the cluster.
- Get me all cronjobs in the cluster.
- Get me all statefulsets in the cluster.
- Get me all daemonsets in the cluster.
- What is the current context?
- List all contexts.
- Switch to context .
- Get me the logs of pod in the production namespace.
- Get me the events in the production namespace.
- Annotate pod with key1=value1 in the production namespace.
- Remove annotation key1 from pod in the production namespace.
- Add label key1=value1 to pod in the production namespace.
- Remove label key1 from pod in the production namespace.
- Expose deployment nginx-app in the production namespace on port 80.
- Port-forward pod, deployment, or service with name in the production namespace to local port 8080.
- Delete pod, deployment, service, job, cronjob, statefulset, or daemonset with name in the production namespace.
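Under the hood, each of these prompts resolves to a tool call that shells out to kubectl. The following is a minimal sketch of how such a wrapper might build and run a command; the helper names and structure are illustrative assumptions, not the server's actual code:

```python
import subprocess
from typing import List

def build_scale_cmd(deployment: str, replicas: int, namespace: str) -> List[str]:
    """Build the kubectl argument list for scaling a deployment.
    (Hypothetical helper; the real server's internals may differ.)"""
    return [
        "kubectl", "scale", f"deployment/{deployment}",
        f"--replicas={replicas}", "--namespace", namespace,
    ]

def run_kubectl(cmd: List[str]) -> str:
    """Run a kubectl command and return its stdout, raising on failure."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# Example: "Scale the deployment nginx-app to 5 replicas in the production namespace"
cmd = build_scale_cmd("nginx-app", 5, "production")
# run_kubectl(cmd)  # requires cluster access; commented out for illustration
```

Building the argument list separately from executing it keeps the command easy to validate (or log) before anything touches the cluster.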
Upcoming Features
- Create cluster role.
- Delete cluster role.
- Create cluster role binding.
- Delete cluster role binding.
- Create namespace.
- Delete namespace.
- Create service account.
- Delete service account.
- Create role.
- Delete role.
- Create role binding.
- Delete role binding.
LLM Integration
This MCP server is designed to work seamlessly with Large Language Models (LLMs). The functions are decorated with @mcp.tool(), making them accessible to LLMs through the Model Context Protocol framework.
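The registration pattern can be illustrated with a toy registry standing in for the MCP framework. The real @mcp.tool() decorator comes from the MCP SDK; this stub only mimics the discovery idea, and the tool name and signature below are illustrative:

```python
from typing import Callable, Dict

# Stand-in for the MCP framework's tool registry (illustrative only).
TOOLS: Dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Toy version of @mcp.tool(): registers the function so a client
    can discover it by name and read its docstring as documentation."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def scale_deployment(name: str, replicas: int, namespace: str = "default") -> str:
    """Scale a deployment to the given number of replicas."""
    return f"Scaled {name} in {namespace} to {replicas} replicas"

# An LLM-facing runtime could now look the tool up by name:
result = TOOLS["scale_deployment"]("nginx-app", 5, namespace="production")
```

Because every tool is registered with its name, signature, and docstring, the protocol layer can expose that metadata to the model, which is what makes tool discovery and type-safe calls possible.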
Example LLM Prompts
LLMs can interact with your Kubernetes cluster using natural language. Here are some example prompts:
- "Create a new nginx deployment with 3 replicas in the production namespace"
- "Scale the nginx-app deployment to 5 replicas"
- "Update the image of nginx-app to version 1.19"
The LLM will interpret these natural language requests and call the appropriate MCP functions with the correct parameters.
Benefits of LLM Integration
- Natural Language Interface: Manage Kubernetes resources using conversational language
- Reduced Command Complexity: No need to remember exact kubectl syntax
- Error Prevention: LLMs can validate inputs and provide helpful error messages
- Context Awareness: LLMs can maintain context across multiple operations
- Structured Interactions: MCP ensures type-safe and documented interactions between LLMs and tools
Requirements
- Kubernetes cluster access configured via kubectl
- Python 3.x
- MCP framework installed and configured
Security Note
When using this client with LLMs, ensure that:
- Proper access controls are in place for your Kubernetes cluster
- The MCP server is running in a secure environment
- API access is properly authenticated and authorized
Usage with Claude Desktop
Add the following to your Claude Desktop MCP configuration:
{
  "mcpServers": {
    "Kubernetes": {
      "command": "uv",
      "args": [
        "--directory",
        "~/mcp/mcp-k8s-server",
        "run",
        "kubernetes.py"
      ]
    }
  }
}
Contributing
We welcome contributions to the MCP Kubernetes Server! If you'd like to contribute:
- Fork the repository
- Create a new branch for your feature (git checkout -b feature/amazing-feature)
- Make your changes
- Write or update tests as needed
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to your branch (git push origin feature/amazing-feature)
- Open a Pull Request
For major changes, please open an issue first to discuss what you would like to change.
Installing via Smithery
To install Kubernetes Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @abhijeetka/mcp-k8s-server --client claude