An MCP server that provides system information, such as CPU and memory usage.
https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http
https://levelup.gitconnected.com/mcp-server-and-client-with-sse-the-new-streamable-http-d860850d9d9d
- `/` — Streamable HTTP endpoint (current specification)
- `/sse` — legacy SSE endpoint (backwards compatibility)
The server uses structured logging via zerolog and supports the following environment variables:

- `LOG_LEVEL` — log level: `trace`, `debug`, `info`, `warn`, `error`, `fatal`, `panic`, or `disabled` (default: `info`)
- `ENVIRONMENT` or `ENV` — environment mode: `development`/`dev` or `production`/`prod` (default: `development`)

ENVIRONMENT=development LOG_LEVEL=debug ./system-info-server
ENVIRONMENT=production LOG_LEVEL=info ./system-info-server
# Minimal logging for production
ENVIRONMENT=production LOG_LEVEL=error PORT=8080 ./system-info-server
# Maximum verbosity for debugging
ENVIRONMENT=development LOG_LEVEL=trace ./system-info-server
# Standard development configuration
LOG_LEVEL=debug ./system-info-server
Every logged event contains contextual fields:

- `component` — system component (`main`, `http`, `session`, `mcp`, `tools`, `sysinfo`, `sse`, `streamable`)
- `session_id` — session identifier for request tracing
- `method` — HTTP method or RPC method
- `duration` — operation execution time
- `status` — HTTP status code
- `error` — error details with context

Example log output in development mode:
14:30:25 INF Starting Fiber server component=main port=8080 addr=:8080
14:30:30 INF Request started component=http method=POST path=/ session_id=session_20240614_143030_abc12345
14:30:30 DBG Processing JSON-RPC request component=mcp method=initialize session_id=session_20240614_143030_abc12345
Example log output in JSON format (production):
{"level":"info","time":"2024-06-14T14:30:25+03:00","caller":"main.go:65","component":"main","port":"8080","addr":":8080","message":"Starting Fiber server"}
{"level":"info","time":"2024-06-14T14:30:30+03:00","caller":"middleware/logging.go:35","component":"http","method":"POST","path":"/","session_id":"session_20240614_143030_abc12345","message":"Request started"}
go build -o system-info-server .
./system-info-server
PORT=8080 ./system-info-server
Add the following to your `~/.cursor/mcp.json`:
{
"mcpServers": {
"system-info-local": {
"command": "/path/to/system-info-server",
"args": []
}
}
}
{
"mcpServers": {
"system-info-remote": {
"url": "https://your-domain.com/"
}
}
}
{
"mcpServers": {
"system-info-legacy": {
"url": "https://your-domain.com/sse"
}
}
}
When adding the MCP server in n8n, specify:
https://your-domain.com/
For the legacy SSE transport, specify instead:
https://your-domain.com/sse
POST /
Content-Type: application/json
Accept: application/json, text/event-stream
{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2025-03-26",
"capabilities": {},
"clientInfo": {
"name": "client-name",
"version": "1.0.0"
}
}
}
The response contains an `Mcp-Session-Id` header, which must be included in all subsequent requests.
POST /
Content-Type: application/json
Accept: application/json
Mcp-Session-Id: <session-id>
{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/list"
}
POST /
Content-Type: application/json
Accept: application/json
Mcp-Session-Id: <session-id>
{
"jsonrpc": "2.0",
"id": 3,
"method": "tools/call",
"params": {
"name": "get_system_info",
"arguments": {}
}
}
GET /
Accept: text/event-stream
Mcp-Session-Id: <session-id>
POST /
Content-Type: application/json
Accept: text/event-stream
Mcp-Session-Id: <session-id>
{
"jsonrpc": "2.0",
"id": 4,
"method": "tools/call",
"params": {
"name": "get_system_info",
"arguments": {}
}
}
DELETE /
Mcp-Session-Id: <session-id>
POST /sse
Content-Type: application/json
{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2024-11-05",
"capabilities": {},
"clientInfo": {
"name": "client-name",
"version": "1.0.0"
}
}
}
GET /sse?sessionId=<session-id>
Accept: text/event-stream
docker build -t mcp-system-info .
# HTTP mode
docker run -p 8080:8080 -e PORT=8080 mcp-system-info
# stdio mode
docker run -it mcp-system-info
docker-compose up -d
When deploying behind nginx, add the following to your configuration:
# For the new Streamable HTTP endpoint
location / {
proxy_pass http://localhost:8080;
proxy_http_version 1.1;
proxy_set_header Connection "";
proxy_buffering off;
proxy_cache off;
proxy_read_timeout 86400;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
# For the legacy SSE endpoint (backwards compatibility)
location /sse {
proxy_pass http://localhost:8080/sse;
proxy_http_version 1.1;
proxy_set_header Connection "";
proxy_buffering off;
proxy_cache off;
proxy_read_timeout 86400;
}
New specification (`/`):

- `Accept: application/json, text/event-stream`
- `Mcp-Session-Id` header
- `Last-Event-Id` header

Legacy specification (`/sse`):

- `endpoint` event on SSE connection
- `sessionId` in the query string

License: MIT