WinTerm MCP
A Model Context Protocol server that provides programmatic access to the Windows terminal. This server enables AI models to interact with the Windows command line interface through a set of standardized tools.
Features
- Write to Terminal: Execute commands or write text to the Windows terminal
- Read Terminal Output: Retrieve output from previously executed commands
- Send Control Characters: Send control signals (e.g., Ctrl+C) to the terminal
- Windows-Native: Built specifically for Windows command line interaction
Installation
- Clone the repository:
  git clone https://github.com/capecoma/winterm-mcp.git
  cd winterm-mcp
- Install dependencies:
  npm install
- Build the project:
  npm run build
- Configure Claude Desktop: add the server config to %APPDATA%/Claude/claude_desktop_config.json:
{
  "mcpServers": {
    "github.com/capecoma/winterm-mcp": {
      "command": "node",
      "args": ["path/to/build/index.js"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
Note: Replace "path/to/build/index.js" with the actual path to your built index.js file.
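If you already have other MCP servers configured, the new entry must be merged in rather than overwriting the file. A minimal sketch of that merge, assuming the config shape shown above (the helper name, types, and example path are illustrative, not part of this project):

```typescript
// Illustrative types matching the claude_desktop_config.json shape above.
type ServerEntry = {
  command: string;
  args: string[];
  disabled?: boolean;
  autoApprove?: string[];
};
type ClaudeConfig = { mcpServers?: Record<string, ServerEntry> };

// Hypothetical helper: add the WinTerm entry while preserving any
// servers already present in the config object.
function addWintermServer(config: ClaudeConfig, indexJsPath: string): ClaudeConfig {
  return {
    ...config,
    mcpServers: {
      ...config.mcpServers,
      "github.com/capecoma/winterm-mcp": {
        command: "node",
        args: [indexJsPath],
        disabled: false,
        autoApprove: [],
      },
    },
  };
}

// Example: an existing, unrelated server entry survives the merge.
const merged = addWintermServer(
  { mcpServers: { other: { command: "node", args: ["other.js"] } } },
  "C:/tools/winterm-mcp/build/index.js" // example path only
);
```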
Available Tools
write_to_terminal
Writes text or commands to the terminal.
{
  "command": "echo Hello, World!"
}
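On the wire, each of these tools is invoked with a standard MCP `tools/call` JSON-RPC request carrying the tool name and its arguments. A minimal sketch of that framing (the `buildToolCall` helper is illustrative; only the method name and params shape come from the MCP spec):

```typescript
// Build the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// "tools/call" and the { name, arguments } params shape are defined by the
// MCP spec; buildToolCall itself is just an illustrative helper.
function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id: number = 1
) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Invoking write_to_terminal with the example arguments above:
const request = buildToolCall("write_to_terminal", {
  command: "echo Hello, World!",
});
```

The same framing applies to `read_terminal_output` and `send_control_character`; only `params.name` and `params.arguments` change.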
read_terminal_output
Reads the specified number of lines from terminal output.
{
  "linesOfOutput": 5
}
send_control_character
Sends a control character to the terminal (e.g., Ctrl+C).
{
  "letter": "C"
}
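In standard ASCII, Ctrl+&lt;letter&gt; corresponds to the letter's character code minus 64, so `"letter": "C"` maps to control code 3 (ETX), the signal that interrupts a foreground process. A sketch of that mapping (whether the server computes it exactly this way is an assumption; the ASCII arithmetic itself is standard):

```typescript
// Map a letter to its ASCII control code: Ctrl+<letter> = charCode - 64.
// e.g. Ctrl+C -> 3 (ETX), Ctrl+Z -> 26 (SUB).
function controlCode(letter: string): number {
  const upper = letter.toUpperCase();
  if (upper.length !== 1 || upper < "A" || upper > "Z") {
    throw new Error(`expected a single letter, got ${JSON.stringify(letter)}`);
  }
  return upper.charCodeAt(0) - 64;
}

const ctrlC = controlCode("C"); // 3 (ETX)
```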
Development
For development with auto-rebuild:
npm run dev
License
MIT License - see LICENSE file.