# Studio MCP

Turns any command-line interface (CLI) command into a simple stdio-based MCP server.

## studio-mcp

A convenience alias package for `@studio-mcp/studio`.
## Usage

This package provides a shorter command name for the Studio MCP CLI:

```
npx studio-mcp [arguments]
```

instead of:

```
npx @studio-mcp/studio [arguments]
```
## What is this?

This is a lightweight alias package that simply wraps and re-exports the `@studio-mcp/studio` CLI. It exists solely to provide a more convenient command name when using `npx`.

All functionality, documentation, and updates are handled by the main `@studio-mcp/studio` package.
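As a sketch of where the shorter name pays off, an MCP client configuration (for example, Claude Desktop's `claude_desktop_config.json`) can launch the server through `npx` with the alias. The `echo` command and the `{{text}}` placeholder below are illustrative assumptions; see the `@studio-mcp/studio` documentation for the actual argument syntax:

```json
{
  "mcpServers": {
    "echo": {
      "command": "npx",
      "args": ["studio-mcp", "echo", "{{text}}"]
    }
  }
}
```

Because the alias resolves to the same CLI, `"studio-mcp"` here is interchangeable with `"@studio-mcp/studio"`.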
## Installation

If you want to install globally:

```
npm install -g studio-mcp
```

Or use directly with `npx` (no installation required):

```
npx studio-mcp
```
## License

MIT