Atla MCP Server
> [!CAUTION]
> This repository was archived on July 21, 2025. The Atla API is no longer active.
An MCP server implementation providing a standardized interface for LLMs to interact with the Atla API for state-of-the-art LLM-as-a-Judge (LLMJ) evaluation.
Learn more about Atla here. Learn more about the Model Context Protocol here.
Available Tools
- `evaluate_llm_response`: Evaluate an LLM's response to a prompt against a single evaluation criterion. This tool uses an Atla evaluation model under the hood and returns a dictionary containing a score for the model's response and a textual critique with feedback on the response.
- `evaluate_llm_response_on_multiple_criteria`: Evaluate an LLM's response to a prompt across multiple evaluation criteria. This tool uses an Atla evaluation model under the hood and returns a list of dictionaries, each containing an evaluation score and critique for a given criterion.
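To make the return shapes concrete, here is an illustrative Python sketch of what the two tools' results look like, based purely on the descriptions above; the exact field names are an assumption, not taken from the Atla API reference.

```python
# Illustrative only: assumed result shapes for the two tools, per the
# descriptions above. Field names are hypothetical.

# evaluate_llm_response -> a single score plus critique:
single_result = {
    "score": "4",
    "critique": "The response is factually accurate but misses one edge case.",
}

# evaluate_llm_response_on_multiple_criteria -> one entry per criterion:
multi_result = [
    {"score": "5", "critique": "Fully addresses the user's question."},
    {"score": "3", "critique": "Wordier than necessary."},
]
```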
Usage
To use the MCP server, you will need an Atla API key. You can find your existing API key here or create a new one here.
Installation
We recommend using `uv` to manage the Python environment. See here for installation instructions.
Manually running the server
Once you have `uv` installed and have your Atla API key, you can manually run the MCP server using `uvx` (which is provided by `uv`):

```shell
ATLA_API_KEY=<your-api-key> uvx atla-mcp-server
```
Connecting to the server
Having issues or need help connecting to another client? Feel free to open an issue or contact us!
OpenAI Agents SDK
For more details on using the OpenAI Agents SDK with MCP servers, refer to the official documentation.
- Install the OpenAI Agents SDK:

```shell
pip install openai-agents
```
- Use the OpenAI Agents SDK to connect to the server:

```python
import asyncio
import os

from agents import Agent
from agents.mcp import MCPServerStdio


async def main() -> None:
    # Launch the Atla MCP server as a subprocess over stdio.
    async with MCPServerStdio(
        params={
            "command": "uvx",
            "args": ["atla-mcp-server"],
            "env": {"ATLA_API_KEY": os.environ.get("ATLA_API_KEY")},
        }
    ) as atla_mcp_server:
        ...  # e.g. pass atla_mcp_server to an Agent via mcp_servers=[...]


asyncio.run(main())
```
Claude Desktop
For more details on configuring MCP servers in Claude Desktop, refer to the official MCP quickstart guide.
- Add the following to your `claude_desktop_config.json` file:
```json
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "uvx",
      "args": ["atla-mcp-server"],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
```
- Restart Claude Desktop to apply the changes.
You should now see options from atla-mcp-server in the list of available MCP tools.
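If you prefer to script this change rather than edit the file by hand, the merge can be sketched as below; the helper name and the temporary demo path are illustrative, and the real location of `claude_desktop_config.json` depends on your OS.

```python
import json
import tempfile
from pathlib import Path


def add_atla_server(config_path: Path, api_key: str) -> dict:
    """Merge the atla-mcp-server entry into a config, preserving other servers."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["atla-mcp-server"] = {
        "command": "uvx",
        "args": ["atla-mcp-server"],
        "env": {"ATLA_API_KEY": api_key},
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config


# Demonstration against a temporary file rather than the real config:
demo_path = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
merged = add_atla_server(demo_path, "<your-atla-api-key>")
```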
Cursor
For more details on configuring MCP servers in Cursor, refer to the official documentation.
- Add the following to your `.cursor/mcp.json` file:
```json
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "uvx",
      "args": ["atla-mcp-server"],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
```
You should now see `atla-mcp-server` in the list of available MCP servers.
Contributing
Contributions are welcome! Please see the CONTRIBUTING.md file for details.
License
This project is licensed under the MIT License. See the LICENSE file for details.