MCP Neurolora
An intelligent MCP server that provides tools for code analysis using the OpenAI API, code collection, and documentation generation.
🚀 Installation Guide
Don't worry if you don't have anything installed yet! Just follow these steps or ask your assistant to help you with the installation.
Step 1: Install Node.js
macOS
- Install Homebrew if not installed:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
- Install Node.js 18:
brew install node@18
echo 'export PATH="/opt/homebrew/opt/node@18/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
Windows
- Download Node.js 18 LTS from nodejs.org
- Run the installer
- Open a new terminal to apply changes
Linux (Ubuntu/Debian)
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs
Step 2: Install uv and uvx
All Operating Systems
- Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Install uvx:
uv pip install uvx
Step 3: Verify Installation
Run these commands to verify everything is installed:
node --version # Should show v18.x.x
npm --version # Should show 9.x.x or higher
uv --version # Should show uv installed
uvx --version # Should show uvx installed
Step 4: Configure MCP Server
Your assistant will help you:
- Find your Cline settings file:
  - VSCode: ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
  - Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Windows VSCode: %APPDATA%/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
  - Windows Claude: %APPDATA%/Claude/claude_desktop_config.json
- Add this configuration:
{
  "mcpServers": {
    "aindreyway-mcp-neurolora": {
      "command": "npx",
      "args": ["-y", "@aindreyway/mcp-neurolora@latest"],
      "env": {
        "NODE_OPTIONS": "--max-old-space-size=256",
        "OPENAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
Step 5: Install Base Servers
Simply ask your assistant: "Please install the base MCP servers for my environment"
Your assistant will:
- Find your settings file
- Run the install_base_servers tool
- Configure all necessary servers automatically
After the installation is complete:
- Close VSCode completely (Cmd+Q on macOS, Alt+F4 on Windows)
- Reopen VSCode
- The new servers will be ready to use
Important: A complete restart of VSCode is required after installing the base servers for them to be properly initialized.
Note: This server uses npx for direct npm package execution, which is optimal for Node.js/TypeScript MCP servers, providing seamless integration with the npm ecosystem and TypeScript tooling.
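If you want to sanity-check the package before adding it to your settings, you can run the same command the configuration above uses (optional; the server typically starts and waits for an MCP client over stdio, so stop it with Ctrl+C when done):

npx -y @aindreyway/mcp-neurolora@latest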
Base MCP Servers
The following base servers will be automatically installed and configured:
- fetch: Basic HTTP request functionality for accessing web resources
- puppeteer: Browser automation capabilities for web interaction and testing
- sequential-thinking: Advanced problem-solving tools for complex tasks
- github: GitHub integration features for repository management
- git: Git operations support for version control
- shell: Basic shell command execution with common commands:
  - ls: List directory contents
  - cat: Display file contents
  - pwd: Print working directory
  - grep: Search text patterns
  - wc: Count words, lines, characters
  - touch: Create empty files
  - find: Search for files
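For reference, after install_base_servers runs, entries for these servers appear in the same mcpServers block as aindreyway-mcp-neurolora. A minimal sketch of what two such entries might look like (the package names and commands shown here are assumptions, not guaranteed values; Python-based servers are typically launched via uvx, which is why Step 2 installs it):

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}

Your assistant fills in the actual values when it runs the tool, so you normally do not need to edit these entries by hand.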
🎯 What Your Assistant Can Do
Ask your assistant to:
- "Analyze my code and suggest improvements"
- "Install base MCP servers for my environment"
- "Collect code from my project directory"
- "Create documentation for my codebase"
- "Generate a markdown file with all my code"
🛠 Available Tools
analyze_code
Analyzes code using OpenAI API and generates detailed feedback with improvement suggestions.
Parameters:
codePath (required): Path to the code file or directory to analyze
Example usage:
{
"codePath": "/path/to/your/code.ts"
}
The tool will:
- Analyze your code using OpenAI API
- Generate detailed feedback with:
  - Issues and recommendations
  - Best practices violations
  - Impact analysis
  - Steps to fix
- Create two output files in your project:
  - LAST_RESPONSE_OPENAI.txt - Human-readable analysis
  - LAST_RESPONSE_OPENAI_GITHUB_FORMAT.json - Structured data for GitHub issues
Note: Requires an OpenAI API key (OPENAI_API_KEY) in the environment configuration
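The GitHub-format file is intended for creating GitHub issues from the findings. Its exact schema is not documented here, so the following is only an illustrative sketch with assumed field names (title, body, labels), not the tool's actual output:

[
  {
    "title": "Avoid synchronous file reads in the request handler",
    "body": "Impact: blocks the event loop under load.\nSteps to fix: switch to fs.promises.readFile.",
    "labels": ["code-review", "performance"]
  }
]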
collect_code
Collects all code from a directory into a single markdown file with syntax highlighting and navigation.
Parameters:
directory (required): Directory path to collect code from
outputPath (optional): Path where to save the output markdown file
ignorePatterns (optional): Array of patterns to ignore (similar to .gitignore)
Example usage:
{
"directory": "/path/to/project/src",
"outputPath": "/path/to/project/src/FULL_CODE_SRC_2024-12-20.md",
"ignorePatterns": ["*.log", "temp/", "__pycache__", "*.pyc", ".git"]
}
install_base_servers
Installs base MCP servers to your configuration file.
Parameters:
configPath (required): Path to the MCP settings configuration file
Example usage:
{
"configPath": "/path/to/cline_mcp_settings.json"
}
🔧 Features
The server provides:
- Code Analysis:
  - OpenAI API integration
  - Structured feedback
  - Best practices recommendations
  - GitHub issues generation
- Code Collection:
  - Directory traversal
  - Syntax highlighting
  - Navigation generation
  - Pattern-based filtering
- Base Server Management:
  - Automatic installation
  - Configuration handling
  - Version management
📄 License
MIT License - feel free to use this in your projects!
👤 Author
Aindreyway
- GitHub: @aindreyway
⭐️ Support
Give a ⭐️ if this project helped you!