A local MCP server that evaluates Pharo Smalltalk expressions and retrieves system information via NeoConsole.
## Prerequisites

- Set the `PHARO_DIR` environment variable to your Pharo installation directory (default: `~/pharo`)
- Make sure `NeoConsole.image` is available in the Pharo directory

## Installation

```bash
git clone <repository-url>
cd pharo-nc-mcp-server
uv sync --dev
```
Start the server:

```bash
uv run pharo-nc-mcp-server
```
Add the server to your MCP client configuration:

```json
{
  "mcpServers": {
    "pharo-nc-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/your-path/to/pharo-nc-mcp-server",
        "run",
        "pharo-nc-mcp-server"
      ]
    }
  }
}
```
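If you manage several checkouts, the configuration entry above can be generated programmatically. A small sketch; the helper name `make_mcp_server_entry` is hypothetical and not part of this project:

```python
import json

def make_mcp_server_entry(project_dir):
    # Build the mcpServers entry shown in the docs for a given checkout path.
    # (Hypothetical convenience helper, not part of pharo-nc-mcp-server.)
    return {
        "mcpServers": {
            "pharo-nc-mcp-server": {
                "command": "uv",
                "args": ["--directory", project_dir, "run", "pharo-nc-mcp-server"],
            }
        }
    }

print(json.dumps(make_mcp_server_entry("/your-path/to/pharo-nc-mcp-server"), indent=2))
```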
### `evaluate_smalltalk_with_neo_console`

Execute Smalltalk expressions in Pharo using NeoConsole:

```python
# Example usage in an MCP client
evaluate_smalltalk_with_neo_console(expression="42 factorial", command="eval")
```
### `evaluate_simple_smalltalk`

Execute Smalltalk expressions using Pharo's simple `-e` option:

```python
# Simple evaluation
evaluate_simple_smalltalk(expression="Time now")
```
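Internally, a tool like this presumably shells out to the Pharo VM. A minimal sketch of how such a `-e` command line could be assembled, assuming a `pharo` launcher inside `PHARO_DIR`; the helper name and image name are illustrative, not the server's actual code:

```python
import os

def build_simple_eval_command(expression, pharo_dir=None):
    """Assemble a hypothetical `pharo <image> -e <expression>` command line."""
    # Resolve the Pharo directory: PHARO_DIR env var, else ~/pharo (per the docs).
    pharo_dir = pharo_dir or os.environ.get("PHARO_DIR", os.path.expanduser("~/pharo"))
    # Illustrative image name; the real server may target a different image.
    return [os.path.join(pharo_dir, "pharo"), "Pharo.image", "-e", expression]

# The resulting list could then be executed with
# subprocess.run(cmd, capture_output=True, text=True).
```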
### `get_pharo_metric`

Retrieve system metrics from Pharo:

```python
# Get system status
get_pharo_metric(metric="system.status")

# Get memory information
get_pharo_metric(metric="memory.free")
```
### `get_class_comment`

Get the comment of a Pharo class:

```python
# Get Array class comment
get_class_comment(class_name="Array")
```
### `get_class_definition`

Get the definition of a Pharo class:

```python
# Get Array class definition
get_class_definition(class_name="Array")
```
### `get_method_list`

Get the list of method selectors for a Pharo class:

```python
# Get all method selectors for Array class
get_method_list(class_name="Array")
```
### `get_method_source`

Get the source code of a specific method in a Pharo class:

```python
# Get source code for Array>>asSet method
get_method_source(class_name="Array", selector="asSet")
```
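When building expression strings on the client side (for example, for `evaluate_smalltalk_with_neo_console`), note that Smalltalk string literals escape an embedded single quote by doubling it. A small quoting helper; it is hypothetical and not part of the server:

```python
def smalltalk_string_literal(text):
    """Quote a Python string as a Smalltalk string literal.

    Smalltalk strings are delimited by single quotes; an embedded
    single quote is written as two consecutive single quotes.
    """
    return "'" + text.replace("'", "''") + "'"

# Build a Smalltalk expression safely from Python data:
expression = "Transcript showln: " + smalltalk_string_literal("it's alive")
# expression == "Transcript showln: 'it''s alive'"
```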
## Environment Variables

- `PHARO_DIR`: Path to the Pharo installation directory (default: `~/pharo`)

## Development

```bash
# Format code
uv run black pharo_nc_mcp_server/

# Lint code
uv run ruff check pharo_nc_mcp_server/

# Run tests
uv run python -m pytest

# Or use the test script
./scripts/test.sh
```
The project includes several convenience scripts in the `scripts/` directory:

### `scripts/format.sh`

Formats all code and documentation files in one command:

```bash
./scripts/format.sh
```

### `scripts/test.sh`

Runs the test suite using pytest:

```bash
./scripts/test.sh
```