Netmind Code Interpreter
Execute code using the Netmind API.
The Code Interpreter AI service, built and customized by the NetMind team, is a high-quality, robust, and cost-efficient solution for executing code in 14+ programming languages through a secure cloud environment. It is fully MCP server–ready, allowing seamless integration with AI agents.
Components
Tools
- execute_code: Execute code in the specified programming language with secure cloud execution.
- language: required: Programming language identifier (string). Supported values: 'python', 'javascript', 'typescript', 'java', 'cpp', 'c', 'go', 'rust', 'php', 'ruby', 'swift', 'kotlin', 'scala', 'r'
- files: required: List of code files to execute. Each file must have 'name' (filename with extension) and 'content' (complete source code as string)
- stdin: optional: Standard input to provide to the program during execution (default: empty string)
- args: optional: Command line arguments to pass to the program (default: empty list)
- Returns execution results with stdout, stderr, and data outputs on success, or error description on failure.
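The parameters above can be assembled into a request payload like the following. This is a local sketch only: the helper name and its validation are my own, not part of the Netmind API.

```python
def build_execute_code_request(language, files, stdin="", args=None):
    """Assemble an execute_code payload and check the required fields."""
    supported = {"python", "javascript", "typescript", "java", "cpp", "c",
                 "go", "rust", "php", "ruby", "swift", "kotlin", "scala", "r"}
    if language not in supported:
        raise ValueError(f"unsupported language: {language!r}")
    for f in files:
        if "name" not in f or "content" not in f:
            raise ValueError("each file needs 'name' and 'content'")
    return {"language": language, "files": files,
            "stdin": stdin, "args": args or []}

request = build_execute_code_request(
    "python",
    [{"name": "main.py", "content": "print(input().upper())"}],
    stdin="hello",
)
```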
Installation
Requires uv (a fast Python package and project manager). If uv isn't installed, install it with one of the following:
# Using Homebrew on macOS
brew install uv
or
# On macOS and Linux.
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows.
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Environment Variables
You can obtain an API key from Netmind.
NETMIND_API_TOKEN: Your Netmind API key
Cursor, Claude Desktop & Windsurf Installation
Add this tool as an MCP server by editing the Cursor/Claude/Windsurf config file.
{
  "mcpServers": {
    "code-interpreter": {
      "env": {
        "NETMIND_API_TOKEN": "XXXXXXXXXXXXXXXXXXXX"
      },
      "command": "uvx",
      "args": [
        "netmind-code-interpreter-mcp"
      ]
    }
  }
}
Features
We provide a robust feature set that handles the complexity of code execution, so you can focus on building what matters.
- Unified Multi-Language Support: Execute code in the language you need, without configuration changes. Our environment provides native support for Python, JavaScript, Java, Go, C++, Rust, Ruby, PHP, and more, each running in an optimized and consistent runtime.
- Seamless File Management & I/O: Effortlessly integrate your data. Upload your scripts, CSVs, and other text-based files directly to the runtime. Your code can read, write, and process these files in a fully sandboxed filesystem for complete security and data integrity.
- Automatic Image & Artifact Generation: Turn data into visuals, automatically. Code that generates plots, charts, or diagrams will have the output automatically captured and delivered to you as a secure link. Get your PNGs, JPEGs, and other results without any extra steps.
- Enterprise-Grade Security and Performance: Execute any code with complete peace of mind. Every execution runs in a disposable, fully isolated container with reasonable resource limits on memory, CPU, and time. With rapid container startup and massive concurrency support, our platform is built for production-scale performance and security.
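The multi-file model above can be sketched as a payload that bundles a data file alongside the script that reads it. The filenames and the assumption about which file serves as the entry point are illustrative, not documented API guarantees.

```python
csv_data = "name,score\nada,92\nlin,87\n"

script = (
    "import csv\n"
    "with open('scores.csv') as f:\n"
    "    rows = list(csv.DictReader(f))\n"
    "print(sum(int(r['score']) for r in rows))\n"
)

payload = {
    "language": "python",
    "files": [
        {"name": "scores.csv", "content": csv_data},  # data file placed in the sandbox
        {"name": "main.py", "content": script},       # script that processes it
    ],
    "stdin": "",
    "args": [],
}
```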
Supported Languages
- Python
- JavaScript/TypeScript
- Java
- C/C++
- Go
- Rust
- PHP
- Ruby
- Swift
- Kotlin
- Scala
- R
Configuration
Set your Netmind API token as an environment variable:
export NETMIND_API_TOKEN="your-api-token-here"
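A startup check for this variable might look like the following. The helper is illustrative; the server itself handles missing credentials in its own way.

```python
import os

def require_token(env=None):
    """Fail fast if NETMIND_API_TOKEN is missing or empty."""
    env = os.environ if env is None else env
    token = env.get("NETMIND_API_TOKEN", "")
    if not token:
        raise RuntimeError("NETMIND_API_TOKEN is not set; obtain a key from Netmind")
    return token
```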
Usage
Starting the Server
# Using the Python module
python -m netmind_code_interpreter.server
Testing the Server
Run the test client to verify everything is working:
python tests/test_client.py
MCP Client Integration
The server can be used with any MCP-compatible client. Here's how to configure it:
- Start the server on your desired port (default: 8000)
- Configure your MCP client to connect to http://localhost:8000/sse
- Use the execute_code tool to run code
API Reference
execute_code Tool
Execute code in the specified programming language.
Parameters:
- language (str): Programming language identifier
- files (List[Dict]): List of code files to execute
  - name (str): Filename with appropriate extension
  - content (str): Complete source code as a string
- stdin (str, optional): Standard input content
- args (List[str], optional): Command line arguments
Returns:
- Success: {"run": {"stdout": "...", "stderr": "...", "data": [...]}}
- Error: {"error": "error description", "run": {"stdout": "", "stderr": "..."}}
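Given the two response shapes above, a client can unpack a result like so. The helper is hypothetical, not part of the server package.

```python
def read_result(response: dict) -> str:
    """Return stdout on success; raise with the error and stderr on failure."""
    if "error" in response:
        raise RuntimeError(f"{response['error']}: {response['run']['stderr']}")
    return response["run"]["stdout"]

ok = read_result({"run": {"stdout": "42\n", "stderr": "", "data": []}})
```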
Example:
# `client` is an initialized MCP client session connected to the server
result = await client.call_tool("execute_code", {
"language": "python",
"files": [
{
"name": "hello.py",
"content": "print('Hello, World!')"
}
]
})
License
MIT License - see LICENSE file for details.