WinCC Unified MCP XT
A Model Context Protocol (MCP) server designed to interface with SIEMENS WinCC Unified SCADA systems via their GraphQL API. This server exposes various WinCC Unified functionalities as MCP tools, enabling AI assistants and other MCP-compatible clients to interact programmatically with the SCADA system.
This project is based on the original repository by Andreas Vogler.
🔧 Features
- Connects to a WinCC Unified GraphQL endpoint.
- Provides MCP tools for:
  - ✅ User authentication (`login-user`)
  - 📂 Browsing SCADA objects (`browse-objects`)
  - 📊 Reading current tag values (`get-tag-values`)
  - 🕒 Querying historical/logged tag data (`get-logged-tag-values`)
  - 🚨 Fetching active alarms (`get-active-alarms`)
  - 📁 Fetching logged alarms (`get-logged-alarms`)
  - ✍️ Writing values to tags (`write-tag-values`)
  - 🟢 Acknowledging alarms (`acknowledge-alarms`)
  - 🔄 Resetting alarms (`reset-alarms`)
- Optional automatic service account login with token refresh mechanism.
⚙️ Prerequisites
- Node.js (v18.x or newer recommended)
- npm (comes with Node.js)
- Access to a running WinCC Unified GraphQL server endpoint
⚙️ Configuration
This server uses a `config.js` file written in ES module syntax.

Example `config.js` (ESM):

```javascript
export const config = {
  URL: "https://your-wincc-server.example.com/graphql", // required: GraphQL endpoint
  userName: "service_account_username", // optional: service-account login
  pwr: "service_account_password", // optional: service-account password
};
```
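A minimal sketch of how the server could validate this config at startup. The `validateConfig` helper is hypothetical (not part of the project); only the field names `URL`, `userName`, and `pwr` come from the example above.

```javascript
// Hypothetical startup check for the config shape shown above.
function validateConfig(config) {
  // URL is required and must point at an http(s) GraphQL endpoint.
  if (typeof config.URL !== "string" || !/^https?:\/\//.test(config.URL)) {
    throw new Error("config.URL must be an http(s) GraphQL endpoint URL");
  }
  // userName and pwr are optional, but only make sense together.
  if (Boolean(config.userName) !== Boolean(config.pwr)) {
    throw new Error("userName and pwr must be set together for service-account login");
  }
  return config;
}
```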
🚀 How to Start
- Navigate to the project folder:

  ```shell
  cd your-project-directory
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Edit `config.js` as shown above.

- Start the server:

  ```shell
  npm start
  ```
🖥️ Connecting with Claude Desktop
To use this MCP server with Claude AI (desktop version):

- Find or create the `claude_desktop_config.json` file (typically in the Claude app config folder).

- Add or update the following:

  ```json
  {
    "mcpServers": {
      "WinCC Unified": {
        "command": "npx",
        "args": ["mcp-remote", "http://localhost:3000/mcp"]
      }
    }
  }
  ```

- Ensure `@modelcontextprotocol/tools` is installed:

  ```shell
  npm install -g @modelcontextprotocol/tools
  ```
🧰 Available MCP Tools
| Tool | Description |
|---|---|
| `login-user` | Logs in with username/password. |
| `browse-objects` | Browses configured SCADA elements. |
| `get-tag-values` | Retrieves live tag values. |
| `get-logged-tag-values` | Gets historical tag data. |
| `get-active-alarms` | Lists currently active alarms. |
| `get-logged-alarms` | Shows previously triggered alarms. |
| `write-tag-values` | Updates one or more tags. |
| `acknowledge-alarms` | Acknowledges alarms. |
| `reset-alarms` | Resets alarms. |
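As an illustration of how an MCP client invokes one of these tools, here is a sketch of a JSON-RPC 2.0 `tools/call` request body. The tool name comes from the table above; the argument shape (`tagNames`) is an assumed example, not the server's documented schema.

```javascript
// Build a JSON-RPC 2.0 "tools/call" request body for an MCP server.
// The argument shape (tagNames) is an assumption, not the documented schema.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

// Example: ask for the current values of two tags.
const req = buildToolCall(1, "get-tag-values", {
  tagNames: ["Motor_1.Speed", "Tank_1.Level"],
});
console.log(JSON.stringify(req, null, 2));
```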
📝 Notes
- If configured, the service account is logged in automatically and its token is refreshed every minute.
- A user's manual login temporarily overrides the service session.
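The refresh behavior described above can be sketched as a small loop: log in once, then re-login on a fixed interval. `loginFn` stands in for the actual GraphQL login call, so this is a shape sketch under that assumption, not the project's implementation.

```javascript
// Sketch of the service-account refresh: initial login, then refresh
// the token on a fixed interval (one minute by default).
// loginFn is an injected placeholder for the real GraphQL login call.
function startTokenRefresh(loginFn, intervalMs = 60000) {
  const state = { token: null };
  const refresh = async () => {
    try {
      state.token = await loginFn();
    } catch (err) {
      // Keep the previous token; the next tick retries.
      console.error("service-account login failed:", err.message);
    }
  };
  refresh(); // initial login
  const timer = setInterval(refresh, intervalMs);
  return { state, stop: () => clearInterval(timer) };
}
```

Returning a `stop` handle lets the server clear the interval on shutdown instead of leaving a dangling timer.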