Matter AI MCP Server
MatterAI MCP offers code reviews right in your IDE when using AI agents such as Cursor, Windsurf, VS Code, Cline, and more, enhancing your development workflow. Built with FastMCP in Python, it provides advanced code review capabilities, implementation planning, and pull request generation to help you release code with confidence.
Features
- Code review tools - Get comprehensive code reviews for individual files or full git diffs
- Implementation planning - Generate detailed implementation plans for AI agents
- Pull request generation - Create pull requests with auto-generated titles and descriptions
- Random cat facts - Because who doesn't love cat facts?
Requirements
- Python 3.11+
- See requirements.txt for dependencies
Installation
```
pip install -r requirements.txt
```
Setup
API Key
To use Matter AI MCP Server, you need an API key:
- Obtain your API key from https://app.matterai.dev/settings
- Use this key in your MCP configuration as shown below
MCP Configuration
Create an MCP configuration file with the following content:
```json
{
  "mcpServers": {
    "matter-ai": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://mcp.matterai.so/sse",
        "--header",
        "X-AUTH-TOKEN:MATTER_AI_API_KEY"
      ]
    }
  }
}
```
Replace `MATTER_AI_API_KEY` with your actual API key.
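If you prefer not to hand-edit the file, a small script along these lines (a sketch, not part of the official tooling; the `MATTER_AI_API_KEY` environment variable name is an assumption) can template the key in from the environment so it never lands in version control:

```python
import json
import os

# Read the key from the environment; fall back to the README placeholder.
# MATTER_AI_API_KEY as a variable name is an assumption, not an official convention.
api_key = os.environ.get("MATTER_AI_API_KEY", "MATTER_AI_API_KEY")

config = {
    "mcpServers": {
        "matter-ai": {
            "command": "npx",
            "args": [
                "-y",
                "mcp-remote",
                "https://mcp.matterai.so/sse",
                "--header",
                f"X-AUTH-TOKEN:{api_key}",  # header expected by the remote endpoint
            ],
        }
    }
}

# Print the finished config; redirect this into your client's MCP config file.
print(json.dumps(config, indent=2))
```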
Usage
Run the server:
```
python server.py
```
The server will start on http://localhost:9000 (the FastMCP default).
Connecting from Cursor or Windsurf
- Use the MCP (Model Context Protocol) integration
- Point the client at http://localhost:9000/sse
- Tools are auto-discovered and appear in the client
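For clients that accept a URL-based server entry, the local server can be registered with a config fragment along these lines (a sketch; the exact schema varies by client and version, and the `matter-ai-local` name is illustrative):

```json
{
  "mcpServers": {
    "matter-ai-local": {
      "url": "http://localhost:9000/sse"
    }
  }
}
```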
Tools
1. Code Review

```
codereview(generated_code: str, git_owner: str, git_repo: str, git_branch: str, git_user: str, languages: str) -> str
```

Provides a code review for the generated code.

2. Full Code Review

```
codereview_full(git_diff: str, git_owner: str, git_repo: str, git_branch: str, git_user: str) -> str
```

Provides a comprehensive code review based on git diff output.

3. Cat Fact

```
cat_fact() -> str
```

Returns a random cat fact.
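To illustrate the kind of input `codereview_full` expects, here is a small helper (hypothetical, not part of the server) that lists the files touched by a unified `git diff` string before it is sent for review:

```python
def changed_files(git_diff: str) -> list[str]:
    """Extract the paths touched by a unified `git diff`.

    Each changed file in `git diff` output begins with a header line
    of the form `diff --git a/<path> b/<path>`.
    """
    files = []
    for line in git_diff.splitlines():
        if line.startswith("diff --git "):
            # Take the `b/<path>` side, which reflects the file's new name.
            files.append(line.split()[-1][2:])
    return files


# A minimal sample diff, like the output of `git diff` on one file.
sample_diff = """\
diff --git a/server.py b/server.py
index 1234567..89abcde 100644
--- a/server.py
+++ b/server.py
@@ -1,3 +1,4 @@
+import os
"""

print(changed_files(sample_diff))  # ['server.py']
```

In practice the raw `git diff` text would be passed as the `git_diff` argument, with the repository metadata filled into the remaining parameters.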
Docker Build and Use
Building the Docker Image
```
docker build -t matter-ai-mcp .
```
Running the Docker Container
```
docker run -p 9000:9000 -e MATTER_API_ENDPOINT=https://api.matterai.so matter-ai-mcp
```
The server will be accessible at http://localhost:9000.
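The image used above can be built from a Dockerfile roughly like the following (a sketch under the assumption that `server.py` and `requirements.txt` sit at the repository root; the actual Dockerfile in the repo may differ):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached across code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 9000
CMD ["python", "server.py"]
```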
License
MIT
Resources:
- Website: https://matterai.so
- Docs: https://docs.matterai.so