🦜🛠️ LangSmith MCP Server

[!WARNING] LangSmith MCP Server is under active development and many features are not yet implemented.

License: MIT · Python 3.10

A production-ready Model Context Protocol (MCP) server that provides seamless integration with the LangSmith observability platform. This server enables language models to fetch conversation history and prompts from LangSmith.

📋 Overview

The LangSmith MCP Server bridges the gap between language models and the LangSmith platform, enabling advanced capabilities for conversation tracking, prompt management, and analytics integration.

🛠️ Installation Options

📝 General Prerequisites

  1. Install uv (a fast Python package installer and resolver):

    curl -LsSf https://astral.sh/uv/install.sh | sh
    
  2. Clone this repository and navigate to the project directory:

    git clone https://github.com/langchain-ai/langsmith-mcp-server.git
    cd langsmith-mcp-server
    

🔌 MCP Client Integration

Once the prerequisites are in place, you can integrate the LangSmith MCP Server with various MCP-compatible clients. There are two installation options:

📦 From PyPI

  1. Install the package:

    uv run pip install --upgrade langsmith-mcp-server
    
  2. Add the following to your MCP client configuration:

    {
        "mcpServers": {
            "LangSmith API MCP Server": {
                "command": "/path/to/uvx",
                "args": [
                    "langsmith-mcp-server"
                ],
                "env": {
                    "LANGSMITH_API_KEY": "your_langsmith_api_key",
                    "LANGSMITH_WORKSPACE_ID": "your_workspace_id",
                    "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com"
                }
            }
        }
    }
    

⚙️ From Source

Add the following configuration to your MCP client settings:

{
    "mcpServers": {
        "LangSmith API MCP Server": {
            "command": "/path/to/uv",
            "args": [
                "--directory",
                "/path/to/langsmith-mcp-server/langsmith_mcp_server",
                "run",
                "server.py"
            ],
            "env": {
                "LANGSMITH_API_KEY": "your_langsmith_api_key",
                "LANGSMITH_WORKSPACE_ID": "your_workspace_id",
                "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com"
            }
        }
    }
}

Replace the following placeholders:

  • /path/to/uv: The absolute path to your uv installation (e.g., /Users/username/.local/bin/uv). You can find it by running which uv.
  • /path/to/langsmith-mcp-server: The absolute path to your langsmith-mcp-server project directory
  • your_langsmith_api_key: Your LangSmith API key (required)
  • your_workspace_id: Your LangSmith workspace ID (optional, for API keys scoped to multiple workspaces)
  • https://api.smith.langchain.com: The LangSmith API endpoint (optional, defaults to the standard endpoint)

Example configuration:

{
    "mcpServers": {
        "LangSmith API MCP Server": {
            "command": "/Users/mperini/.local/bin/uvx",
            "args": [
                "langsmith-mcp-server"
            ],
            "env": {
                "LANGSMITH_API_KEY": "lsv2_pt_1234",
                "LANGSMITH_WORKSPACE_ID": "your_workspace_id",
                "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com"
            }
        }
    }
}

Copy this configuration into Cursor > MCP Settings.

🔧 Environment Variables

The LangSmith MCP Server supports the following environment variables:

Variable | Required | Description | Example
LANGSMITH_API_KEY | ✅ Yes | Your LangSmith API key for authentication | lsv2_pt_1234567890
LANGSMITH_WORKSPACE_ID | ❌ No | Workspace ID for API keys scoped to multiple workspaces | your_workspace_id
LANGSMITH_ENDPOINT | ❌ No | Custom API endpoint URL (for self-hosted or EU region) | https://api.smith.langchain.com

Notes:

  • Only LANGSMITH_API_KEY is required for basic functionality
  • LANGSMITH_WORKSPACE_ID is useful when your API key has access to multiple workspaces
  • LANGSMITH_ENDPOINT allows you to use custom endpoints for self-hosted LangSmith installations or the EU region
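
For illustration, a small launcher or wrapper script might resolve these variables before starting the server. The sketch below is a minimal, assumed example: the fallback URL mirrors the standard endpoint noted above, and the server's own resolution logic may differ.

    import os

    # Minimal sketch (assumed, not the server's actual code): resolve the
    # environment variables with the semantics described in the table above.
    api_key = os.environ["LANGSMITH_API_KEY"]  # required; raises KeyError if unset
    workspace_id = os.environ.get("LANGSMITH_WORKSPACE_ID")  # optional
    endpoint = os.environ.get(
        "LANGSMITH_ENDPOINT",
        "https://api.smith.langchain.com",  # optional, standard endpoint by default
    )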

🧪 Development and Contributing 🤝

If you want to develop or contribute to the LangSmith MCP Server, follow these steps:

  1. Create a virtual environment and install dependencies:

    uv sync
    
  2. To include test dependencies:

    uv sync --group test
    
  3. View available MCP commands:

    uvx langsmith-mcp-server
    
  4. For development, run the MCP inspector:

    uv run mcp dev langsmith_mcp_server/server.py
    
    • This will start the MCP inspector on a network port
    • Install any required libraries when prompted
    • The MCP inspector will be available in your browser
    • Set the LANGSMITH_API_KEY environment variable in the inspector
    • Connect to the server
    • Navigate to the "Tools" tab to see all available tools
  5. Before submitting your changes, run the linting and formatting checks:

    make lint
    make format
    

🚀 Example Use Cases

The server enables powerful capabilities including:

  • 💬 Conversation History: "Fetch the history of my conversation with the AI assistant from thread 'thread-123' in project 'my-chatbot'"
  • 📚 Prompt Management: "Get all public prompts in my workspace"
  • 🔍 Smart Search: "Find private prompts containing the word 'joke'"
  • 📝 Template Access: "Pull the template for the 'legal-case-summarizer' prompt"
  • 🔧 Configuration: "Get the system message from a specific prompt template"

🛠️ Available Tools

The LangSmith MCP Server provides the following tools for integration with LangSmith:

Tool Name | Description
list_prompts | Fetch prompts from LangSmith with optional filtering. Filter by visibility (public/private) and limit results.
get_prompt_by_name | Get a specific prompt by its exact name, returning the prompt details and template.
get_thread_history | Retrieve the message history for a specific conversation thread, returning messages in chronological order.
get_project_runs_stats | Get statistics about runs in a LangSmith project, either for the last run or overall project stats.
fetch_trace | Fetch trace content for debugging and analyzing LangSmith runs using project name or trace ID.
list_datasets | Fetch LangSmith datasets with filtering options by ID, type, name, or metadata.
list_examples | Fetch examples from a LangSmith dataset with advanced filtering options.
read_dataset | Read a specific dataset from LangSmith using dataset ID or name.
read_example | Read a specific example from LangSmith using the example ID and optional version information.
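
Beyond IDE clients like Cursor, these tools can also be called programmatically. The sketch below uses the official MCP Python SDK (the mcp package) to launch the published server over stdio and invoke one tool. The tool names come from the table above, but the argument names (thread_id, project_name) are assumptions and may not match the server's actual schema.

    import asyncio
    import os

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch the published package with uvx, passing the required API key through.
    server_params = StdioServerParameters(
        command="uvx",
        args=["langsmith-mcp-server"],
        env={"LANGSMITH_API_KEY": os.environ["LANGSMITH_API_KEY"]},
    )

    async def main():
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Discover the tools listed in the table above.
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

                # Hypothetical call: argument names are assumed, not confirmed.
                result = await session.call_tool(
                    "get_thread_history",
                    arguments={"thread_id": "thread-123", "project_name": "my-chatbot"},
                )
                print(result.content)

    asyncio.run(main())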

📄 License

This project is distributed under the MIT License. For detailed terms and conditions, please refer to the LICENSE file.

Made with ❤️ by the LangChain Team
