pharo-nc-mcp-server
A local MCP server to evaluate Pharo Smalltalk expressions and get system information via NeoConsole.
Prerequisites
- Python 3.10 or later
- uv package manager
- Pharo with NeoConsole installed
Pharo Setup
- Install Pharo and NeoConsole
- Set the `PHARO_DIR` environment variable to your Pharo installation directory (default: `~/pharo`)
- Ensure `NeoConsole.image` is available in the Pharo directory
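A minimal shell sketch of this setup, assuming a standard installation under `~/pharo` (the exact path is illustrative):
# Point PHARO_DIR at your Pharo installation (the server defaults to ~/pharo)
export PHARO_DIR="$HOME/pharo"
# Verify that the image the server expects is present
ls "$PHARO_DIR/NeoConsole.image"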
Installation
- Clone the repository:
git clone <repository-url>
cd pharo-nc-mcp-server
- Install dependencies using uv:
uv sync --dev
Usage
Running the MCP Server
Start the server:
uv run pharo-nc-mcp-server
Cursor MCP settings
{
  "mcpServers": {
    "pharo-nc-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/your-path/to/pharo-nc-mcp-server",
        "run",
        "pharo-nc-mcp-server"
      ]
    }
  }
}
MCP Tools Available
evaluate_smalltalk_with_neo_console
Execute Smalltalk expressions in Pharo using NeoConsole:
# Example usage in MCP client
evaluate_smalltalk_with_neo_console(expression="42 factorial", command="eval")
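The expression can be any Smalltalk source, including temporaries and multiple statements. A hypothetical call using the same tool and parameters:
# Evaluate a multi-statement expression with a temporary variable
evaluate_smalltalk_with_neo_console(
    expression="| sum | sum := (1 to: 10) inject: 0 into: [:a :b | a + b]. sum",
    command="eval"
)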
evaluate_simple_smalltalk
Execute Smalltalk expressions using Pharo's simple -e option:
# Simple evaluation
evaluate_simple_smalltalk(expression="Time now")
get_pharo_metric
Retrieve system metrics from Pharo:
# Get system status
get_pharo_metric(metric="system.status")
# Get memory information
get_pharo_metric(metric="memory.free")
get_class_comment
Get the comment of a Pharo class:
# Get Array class comment
get_class_comment(class_name="Array")
get_class_definition
Get the definition of a Pharo class:
# Get Array class definition
get_class_definition(class_name="Array")
get_method_list
Get the list of method selectors for a Pharo class:
# Get all method selectors for Array class
get_method_list(class_name="Array")
get_method_source
Get the source code of a specific method in a Pharo class:
# Get source code for Array>>asSet method
get_method_source(class_name="Array", selector="asSet")
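The class-inspection tools compose naturally when exploring an unfamiliar class. A sketch of one possible flow, using only the tools documented above (client-side variable names are illustrative):
# Browse a class top-down: definition, then selectors, then one method body
definition = get_class_definition(class_name="OrderedCollection")
selectors = get_method_list(class_name="OrderedCollection")
source = get_method_source(class_name="OrderedCollection", selector="add:")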
Environment Variables
`PHARO_DIR`: Path to the Pharo installation directory (default: `~/pharo`)
Development
Code Formatting and Linting
# Format code
uv run black pharo_nc_mcp_server/
# Lint code
uv run ruff check pharo_nc_mcp_server/
# Run tests
uv run python -m pytest
# Or use the test script
./scripts/test.sh
Development Scripts
The project includes several convenience scripts in the scripts/ directory:
scripts/format.sh
Formats all code and documentation files in one command:
- Formats Python code using Black
- Formats markdown files using mdformat
- Runs linting checks with Ruff
./scripts/format.sh
scripts/test.sh
Runs the test suite using pytest:
./scripts/test.sh