A Model Context Protocol (MCP) server for the Uyuni Server API, used to interact with Uyuni for infrastructure and configuration management.
This is an open-source project provided "AS IS" without any warranty, express or implied. Use at your own risk. For full details, please refer to the License section.
There are two main ways to run mcp-server-uyuni: using the pre-built Docker container or running it locally with `uv`. Both methods require a config file.

Before running the server, you need to create a config file. You can place it anywhere, but you must provide the correct path to it when running the server:
```
UYUNI_SERVER=192.168.1.124:8443
UYUNI_USER=admin
UYUNI_PASS=admin
# Optional: Set to 'false' to disable SSL certificate verification. Defaults to 'true'.
# UYUNI_SSL_VERIFY=false
# Optional: Set to 'true' to enable tools that perform write actions (e.g., POST requests). Defaults to 'false'.
# UYUNI_MCP_WRITE_TOOLS_ENABLED=false
# Optional: Set the transport protocol. Can be 'stdio' (default) or 'http'.
# UYUNI_MCP_TRANSPORT=stdio
```
> [!WARNING]
> **Security Note on HTTP Transport:** When `UYUNI_MCP_TRANSPORT` is set to `http`, the server runs without authentication. This means any client with network access can execute commands. Only use this mode in a trusted, isolated network environment. For more details, see the Security Policy.
> [!WARNING]
> **Security Note on Write Tools:** Enabling `UYUNI_MCP_WRITE_TOOLS_ENABLED` allows the execution of state-changing and potentially destructive actions (e.g., removing systems, applying updates). When combined with `UYUNI_MCP_TRANSPORT=http`, this risk is amplified, as any client with network access can perform these actions. Only enable write tools in a trusted environment.
```
# Optional: Set the path for the server log file. Defaults to logging to the console.
# UYUNI_MCP_LOG_FILE_PATH=/var/log/mcp-server-uyuni.log
UYUNI_SSH_PRIV_KEY="-----BEGIN OPENSSH PRIVATE KEY-----\n..."
UYUNI_SSH_PRIV_KEY_PASS=""
```
Replace the values with your Uyuni server details. This file contains sensitive information and should not be committed to version control.
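For example, a config file that deliberately opts in to write tools and HTTP transport (see the warnings above) could look like the following sketch; the server address and credentials are illustrative placeholders:

```
UYUNI_SERVER=192.168.1.124:8443
UYUNI_USER=admin
UYUNI_PASS=admin
# Opt in to state-changing tools (see the security note above).
UYUNI_MCP_WRITE_TOOLS_ENABLED=true
# Serve over HTTP instead of stdio (unauthenticated; trusted networks only).
UYUNI_MCP_TRANSPORT=http
```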
> [!NOTE]
> **Formatting the SSH Private Key:** The `UYUNI_SSH_PRIV_KEY` variable, used by the `add_system` tool, requires the entire private key as a single-line string. The newlines from the original key file must be replaced by the literal `\n` sequence.
>
> You can generate the correct format from your key file (e.g., `~/.ssh/id_rsa`) using the following command, and then copy the output into your config file or environment variable:
>
> ```
> awk 'NF {printf "%s\\n", $0}' ~/.ssh/id_rsa
> ```
>
> To set it as an environment variable directly in your shell, run:
>
> ```
> export UYUNI_SSH_PRIV_KEY=$(awk 'NF {printf "%s\\n", $0}' ~/.ssh/id_rsa)
> ```
Alternatively, you can set these values as environment variables instead of using a config file.
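For example, a minimal set of shell exports using the same variables described above might look like this (illustrative values):

```
export UYUNI_SERVER=192.168.1.124:8443
export UYUNI_USER=admin
export UYUNI_PASS=admin
# Optional, same semantics as in the config file:
export UYUNI_SSL_VERIFY=true
```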
You can test the server with the MCP Inspector. With the Docker option, run:

```
npx @modelcontextprotocol/inspector docker run -i --rm --env-file /path/to/your/config ghcr.io/uyuni-project/mcp-server-uyuni:latest
```

or, with the `uv` option, run:

```
npx @modelcontextprotocol/inspector uv run --env-file=.venv/config --directory . mcp-server-uyuni
```
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with built-in inference engine for RAG, making it a powerful AI deployment solution. More at https://docs.openwebui.com/
> [!NOTE]
> The following instructions describe how to set up Open WebUI and the MCP proxy for local development and testing purposes. For production deployments, please refer to the official Open WebUI documentation for recommended setup procedures.
You need `uv` installed. See https://docs.astral.sh/uv

Start Open WebUI v0.6.10 (for MCP support we need a version >= 0.6.7):

```
uv tool run open-webui@0.6.10 serve
```
Configure the OpenAI API URL by following these instructions: https://docs.openwebui.com/getting-started/quick-start/starting-with-openai

For Gemini, use the URL https://generativelanguage.googleapis.com/v1beta/openai and get the API token from Google AI Studio at https://aistudio.google.com/
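If you want to sanity-check the Gemini endpoint and your token before wiring them into Open WebUI, a request like the one below should list the available models. This assumes the OpenAI-compatible endpoint exposes the standard `/models` route and that `GEMINI_API_KEY` holds your token from Google AI Studio; adjust if Google's API differs:

```
curl https://generativelanguage.googleapis.com/v1beta/openai/models \
  -H "Authorization: Bearer $GEMINI_API_KEY"
```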
First, ensure you have your config file ready as described in the Usage section. Then, you need a `config.json` file for the MCP to OpenAPI proxy server.
This is the easiest method for deployment. Pre-built container images are available on the GitHub Container Registry. Replace `/path/to/your/config` with the absolute path to your config file, and replace `VERSION` with the desired release tag (e.g., `v0.2.1`) or use `latest` for the most recent build from the `main` branch.
```json
{
"mcpServers": {
"mcp-server-uyuni": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"--env-file", "/path/to/your/config",
"ghcr.io/uyuni-project/mcp-server-uyuni:VERSION"
]
}
}
}
```
Alternatively, you can use environment variables instead of a file.
```json
{
"mcpServers": {
"mcp-server-uyuni": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "UYUNI_SERVER=192.168.1.124:8443",
"-e", "UYUNI_USER=admin",
"-e", "UYUNI_PASS=admin",
"ghcr.io/uyuni-project/mcp-server-uyuni:VERSION"
]
}
}
}
```
This method runs the server locally with `uv` and is ideal for development.

1. Install `uv`: see https://docs.astral.sh/uv
2. Run `uv sync`.
3. Replace `/path/to/your/config` below with the absolute path to your config file.

```json
{
"mcpServers": {
"mcp-server-uyuni": {
"command": "uv",
"args": [
"run",
"--env-file", "/path/to/your/config",
"--directory", ".",
"mcp-server-uyuni"
]
}
}
}
```
Then, you can start the Model Context Protocol to OpenAPI proxy server:

```
uvx mcpo --port 9000 --config ./config.json
```

After that, add the tool server to Open WebUI. See https://docs.openwebui.com/openapi-servers/open-webui#step-2-connect-tool-server-in-open-webui for how to connect the tool server. Note that the URL should be http://localhost:9000/mcp-server-uyuni, as explained in https://docs.openwebui.com/openapi-servers/open-webui#-optional-using-a-config-file-with-mcpo
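Because `mcpo` exposes each configured MCP server as a regular OpenAPI service (it is built on FastAPI), you can verify the proxy is reachable before adding it to Open WebUI. The exact paths may vary by mcpo version, but something like the following should return the generated schema, with interactive docs served at the `/docs` path of the same route:

```
# Fetch the generated OpenAPI schema for the uyuni tool server
# (the route name comes from the key used in config.json)
curl http://localhost:9000/mcp-server-uyuni/openapi.json
```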
> [!NOTE]
> The Model Context Protocol (MCP) includes advanced features like elicitation, which allows tools to interactively prompt the user for missing information or confirmation. As of this writing, not all MCP clients support this capability. For example, Open WebUI does not currently implement elicitation.
>
> To test tools that leverage elicitation (like the `add_system` tool when an activation key is missing), you need a compatible client. The official MCP extension for Visual Studio Code is a reference client that fully supports elicitation and is recommended for developing and testing these features.
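As a sketch of how you might register the server in Visual Studio Code for elicitation testing, a workspace file such as `.vscode/mcp.json` can reuse the same Docker invocation shown earlier. The exact schema depends on your VS Code version, so treat the field names below as assumptions and check the VS Code MCP documentation:

```json
{
  "servers": {
    "mcp-server-uyuni": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--env-file", "/path/to/your/config",
        "ghcr.io/uyuni-project/mcp-server-uyuni:latest"
      ]
    }
  }
}
```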
To build the Docker image locally for development or testing purposes:

```
docker build -t mcp-server-uyuni .
```

Then, you can use `docker run -i --rm --env-file .venv/config mcp-server-uyuni` in any of the MCP client configurations explained above.
To create a new release for `mcp-server-uyuni`, follow these steps:

1. Create a release branch: `git fetch upstream && git checkout upstream/main -b release-x.y.z` (assuming `upstream` is the remote alias for the upstream git repository).
2. Update the version and documentation (`README.md`): bump the version in `srv/mcp-server-uyuni/server.py`, and update the screenshots in the `docs/` directory and their references in this `README.md` to reflect the latest UI or functionality, if necessary.
3. Update the manual test cases (`TEST_CASES.md`): add a new column for the new version (e.g., `Status (vX.Y.Z)`) to the test case table in `TEST_CASES.md`, and record the `Pass`, `Fail`, `Blocked`, or `N/A` status for each test case in the new version column.
4. Commit the changes to `README.md`, `TEST_CASES.md`, and any other changed files.
5. Run `uv lock` to update the `uv.lock` file with the version set in `pyproject.toml`.
6. Generate the changelog with `conventional-changelog-cli`. If you don't have it installed globally, you can use `npx`. The command to generate the changelog using the `conventionalcommits` preset and output it to `CHANGELOG.md` (prepending the new changes) is:

   ```
   npx conventional-changelog-cli -p conventionalcommits -i CHANGELOG.md -s
   ```

7. Review the generated `CHANGELOG.md` for accuracy and formatting, then commit the updated `CHANGELOG.md`.
8. Push the release branch: `git push origin release-x.y.z` (assuming `origin` is the remote alias for your git fork).
9. Tag the release (e.g., `git tag vX.Y.Z`), following semantic versioning rules.
10. Push the tag (`git push && git push --tags`).
11. The container image is published to the GitHub Container Registry (`ghcr.io`) with tags for the specific version (e.g., `v0.3.0`) and major.minor (e.g., `v0.3`). Pushing to `main` will update the `latest` tag.
12. Pull the image from `ghcr.io` and run the tests in `TEST_CASES.md` against it:

    ```
    docker run -i --rm --env-file .venv/config ghcr.io/uyuni-project/mcp-server-uyuni:VERSION
    ```

    (replace `VERSION` with the new tag).

We would love to hear from you! Any idea you want to discuss or share, please do so at https://github.com/uyuni-project/uyuni/discussions/10562
If you encounter any bug, please open a new bug report at https://github.com/uyuni-project/mcp-server-uyuni/issues/new?type=bug

Thanks in advance from the Uyuni team!
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.