Fabi Analyst Agent MCP
Fabi MCP is an autonomous agent that handles end-to-end data analysis tasks from natural language requests: it automatically discovers data schemas, generates SQL or Python code, executes queries, and presents insights.
MCP Server
Overview
Fabi.ai provides an MCP (Model Context Protocol) server that lets you integrate Fabi's AI data analysis capabilities directly into your development workflow or your client/interface of choice. The MCP server enables AI assistants and development tools to interact with Fabi.ai programmatically: creating threads, submitting chat requests, and saving Smartbooks. The Fabi MCP server is the fastest way to set up an AI assistant that chats directly with your data.
Authentication
The Fabi MCP server supports two authentication methods:
Token authentication
Token authentication is the recommended method for programmatic access. You can generate MCP tokens from your Fabi.ai settings:
- Navigate to https://app.fabi.ai/settings/mcp
- Generate a new MCP token
- Copy the token and store it securely - it will only be shown once
OAuth authentication
OAuth authentication is also supported for user-based integrations. Follow the OAuth flow to authenticate your application and use the following URL: https://app.fabi.ai/mcp
Configuration
To connect to the Fabi MCP server, add the following configuration to your MCP client settings:
{
  "mcpServers": {
    "fabi": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://app.fabi.ai/mcp",
        "--header",
        "Authorization: Bearer <your-fabi-mcp-token>"
      ]
    }
  }
}
Replace <your-fabi-mcp-token> with the token you generated from the settings page.
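To keep the token itself out of version-controlled config files (as the token security best practices below recommend), you can route it through an environment variable. Exact substitution behavior varies by MCP client, and some clients do not expand variables in `args` at all, so treat this as a sketch and check your client's and mcp-remote's documentation; `FABI_MCP_TOKEN` is just a naming convention:

```json
{
  "mcpServers": {
    "fabi": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://app.fabi.ai/mcp",
        "--header",
        "Authorization: Bearer ${FABI_MCP_TOKEN}"
      ],
      "env": {
        "FABI_MCP_TOKEN": "<your-fabi-mcp-token>"
      }
    }
  }
}
```

Alternatively, export `FABI_MCP_TOKEN` in the shell that launches your MCP client and drop the `env` block entirely.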
Available tools
The Fabi MCP server provides the following tools for interacting with Fabi.ai:
Create a thread
Creates a new data analysis session (backed by a Smartbook) for SQL/Python queries. Used to start analyzing database tables, running queries, or exploring data. The thread will persist your analysis history and generated code. Parameters:
- title (optional): Title for the analysis session
Returns: Thread UUID for subsequent operations
Submit chat
Delegate a data analysis task to the Fabi autonomous agent. Takes a description of what you want in natural language - Fabi will independently handle all complexity: discovering data sources, using RAG to find relevant table/column schemas and semantics, generating SQL/Python code, validating queries with dry runs, executing them, and formatting results. Parameters:
- thread_uuid: UUID of the thread from create_thread
- message: Natural language data analysis request (e.g., 'show top 10 customers by revenue')
- context_cell_uuids (optional): Previous cell UUIDs to reference in this analysis
- context_dataframes (optional): Variable names of dataframes to use as context
Returns: Request UUID and initial status
Get chat result
Poll for the result of a long-running chat analysis request. Used by the agent when submit_chat returns early due to timeout (after 45 seconds). The chat continues processing in the background - call this periodically to check if results are ready. Parameters:
- request_uuid: UUID of the chat request (returned by submit_chat)
Returns: Processing status or completed results with data preview
Save to Smartbook
Save AI-generated cells from chat history to the Smartbook for dashboard publishing or collaboration. Used by the agent to persist, publish, or share the analysis as a dashboard. This accepts pending chat results and converts them into executable Smartbook cells. Regular analyses are already viewable in chat history and don’t need saving. Parameters:
- thread_uuid: UUID of the thread to save
Returns: Confirmation of saved Smartbook cells
Usage examples
Basic workflow
- Create a thread to start a new analysis session
- Submit chat requests with natural language queries
- Get chat results to retrieve analysis outputs
- Save to Smartbook (optional) to persist results for sharing
Example: Analyzing sales data
// 1. Create a new thread
const thread = await mcp.callTool('fabi', 'create_thread', {
  title: 'Sales Analysis Q4 2024'
});

// 2. Submit an analysis request
const request = await mcp.callTool('fabi', 'submit_chat', {
  thread_uuid: thread.uuid,
  message: 'Show top 10 customers by revenue in Q4 2024'
});

// 3. Get the results
const results = await mcp.callTool('fabi', 'get_chat_result', {
  request_uuid: request.uuid
});

// 4. Save to Smartbook for sharing
await mcp.callTool('fabi', 'save_to_smartbook', {
  thread_uuid: thread.uuid
});
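Because submit_chat returns early after 45 seconds while the analysis keeps running, step 3 may come back with a "processing" status rather than final results. A minimal polling wrapper is sketched below; the `status` field and its "processing" value follow the tool descriptions above, but the exact result shape is an assumption, and `getResult` is a hypothetical callback you supply:

```javascript
// Poll a long-running chat request until it leaves the "processing" state.
// `getResult` is any async function that fetches the current result for a
// request UUID (e.g. a thin wrapper around the get_chat_result tool call).
async function pollChatResult(getResult, requestUuid, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await getResult(requestUuid);
    if (result.status !== 'processing') {
      return result; // completed (or failed): hand it back to the caller
    }
    // Still running in the background on Fabi's side; wait and retry.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Request ${requestUuid} still processing after ${maxAttempts} polls`);
}
```

With this helper, step 3 above becomes `const results = await pollChatResult((uuid) => mcp.callTool('fabi', 'get_chat_result', { request_uuid: uuid }), request.uuid);`.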
Troubleshooting
- First try using the AI Analyst Agent in the Fabi UI to ensure it works as expected. If it does, the problem is likely in your MCP setup:
- If using token-based authentication, make sure you’re using a valid token
- If using a local agent, restart the agent after configuring the tools
Best practices
Token security
- Store MCP tokens securely and never commit them to version control
- Use environment variables or secure secret management
- Rotate tokens regularly for enhanced security
- Each token should only be used by one application or user
Error handling
- Implement retry logic for get_chat_result when status is "processing"
- Handle timeout scenarios gracefully
- Validate thread UUIDs before making subsequent calls
Performance
- Use context_cell_uuids and context_dataframes to build on previous analyses
- Only call save_to_smartbook when you need to persist results
- Batch related queries in the same thread for better context
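The optional context parameters can be assembled with a small helper so follow-up requests only send fields that are actually populated. This is a sketch: `buildFollowUp` is a hypothetical convenience function, not part of the Fabi API, and the parameter names come from the submit_chat tool description above:

```javascript
// Assemble submit_chat parameters for a follow-up request that reuses
// earlier results as context. Optional fields are only included when
// there is something to pass, keeping the payload minimal.
function buildFollowUp(threadUuid, message, { cellUuids = [], dataframes = [] } = {}) {
  const params = { thread_uuid: threadUuid, message };
  if (cellUuids.length > 0) params.context_cell_uuids = cellUuids;
  if (dataframes.length > 0) params.context_dataframes = dataframes;
  return params;
}
```

For example: `await mcp.callTool('fabi', 'submit_chat', buildFollowUp(thread.uuid, 'Now break that revenue down by region', { dataframes: ['top_customers'] }));` (the dataframe name is a placeholder).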