Tabby-MCP-Server
A powerful Tabby plugin that implements a Model Context Protocol (MCP) server, enabling AI-powered terminal control and automation.
Plugin for Tabby Terminal - a modern, highly configurable terminal emulator.
Video Demo
Watch the full video demonstration of Tabby-MCP in action:
Features
- AI Connection: Seamlessly connect AI assistants to your terminal
- MCP Server: Built-in Model Context Protocol server implementation
- Terminal Control: Allow AI to execute commands and read terminal output
- Session Management: View and manage SSH sessions
- Command Abort: Safely abort running commands
- Buffer Access: Retrieve terminal buffer content with flexible options
- Pair Programming Mode: Optional confirmation dialog before command execution
- Command History: Track and review previously executed commands
- Command Output Storage: Paginated access to complete command outputs
Installation
Install from Tabby Plugin Store
- Go to Tabby settings → Plugins → MCP
- Click "Install" on the Tabby MCP plugin
- Restart Tabby
- Configure your AI client to connect to the MCP server (see Connecting to MCP)
Using Docker
You can build and install the plugin using Docker with the following commands:
git clone https://github.com/thuanpham582002/tabby-mcp-server.git
cd tabby-mcp-server
# Build the Docker image
docker build -t tabby-mcp . && docker run -v $(pwd)/build:/output tabby-mcp
bash scripts/copy_to_plugin_folder.sh
These commands build a Docker image tagged 'tabby-mcp' and run a container from it, mounting your local 'build' directory to '/output' in the container. The scripts/copy_to_plugin_folder.sh script then copies the built files into the Tabby plugin folder.
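If you want a rough picture of what the copy step does, the sketch below mirrors it in Node/TypeScript. The plugin directory path and the file name copy-plugin.ts are assumptions (Tabby's plugin folder location varies by platform); the actual logic lives in scripts/copy_to_plugin_folder.sh.
// copy-plugin.ts - rough sketch of the copy step, not the actual script
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Assumption: Tabby loads third-party plugins from <config dir>/plugins/node_modules/<plugin name>.
// On Linux the config dir is typically ~/.config/tabby; adjust for macOS or Windows.
const pluginDir = path.join(os.homedir(), ".config", "tabby", "plugins", "node_modules", "tabby-mcp");

fs.mkdirSync(pluginDir, { recursive: true });
fs.cpSync("build", pluginDir, { recursive: true }); // copy the Docker build output into the plugin folder
console.log(`Copied build/ to ${pluginDir} - restart Tabby to load the plugin`);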
Quick Start
- Install the plugin using one of the methods above
- Start Tabby and navigate to Settings → Plugins → MCP
- Configure the MCP server port (default: 3001)
- Toggle "Start on Boot" to automatically start the server when Tabby launches
- Connect to the MCP server from any supported AI client listed at https://modelcontextprotocol.io/clients
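Once the server is running, you can sanity-check that its SSE endpoint is reachable before wiring up a client. This is a minimal sketch assuming Node 18+ (for the global fetch) and the default port 3001; the /sse path comes from the connection examples below.
// check-mcp.ts - quick reachability check for the MCP server (default port 3001)
const res = await fetch("http://localhost:3001/sse");
// A running server should answer with an event stream (Content-Type: text/event-stream)
console.log(res.status, res.headers.get("content-type"));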
Usage Examples
Connect an AI to Control Your Terminal
- Start Tabby with the MCP plugin enabled
- Configure your AI client to connect to the MCP server (see Connecting to MCP)
- Ask your AI assistant to run commands or manage your terminal sessions
Example AI prompt:
Connect to my Tabby MCP server and list all available terminal sessions.
Then execute the command "ls -la" in the first available terminal.
Connecting to MCP
To configure AI clients to use your MCP server, add the following to your ~/.cursor/mcp.json file:
STDIO mode:
{
"mcpServers": {
"Tabby MCP": {
"command": "npx",
"args": [
"-y",
"tabby-mcp-stdio",
"--port",
"3001"
]
}
}
}
SSE mode:
{
"mcpServers": {
"Tabby MCP": {
"type": "sse",
"url": "http://localhost:3001/sse"
}
}
}
Select your preferred MCP server in your AI client settings. The Tabby MCP plugin must be running for the "Tabby MCP" (SSE) option to work, while the STDIO and Docker options will start their own server instances.
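For clients other than Cursor, or for scripting against the server directly, the flow looks roughly like the sketch below. It assumes the official MCP TypeScript SDK (@modelcontextprotocol/sdk); the import paths and Client API belong to that SDK, not to this plugin.
// mcp-client.ts - connect to the Tabby MCP server over SSE and list its tools
// (sketch using @modelcontextprotocol/sdk; install the package first)
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const transport = new SSEClientTransport(new URL("http://localhost:3001/sse"));
const client = new Client({ name: "tabby-mcp-example", version: "1.0.0" }, { capabilities: {} });

await client.connect(transport);
const { tools } = await client.listTools();
console.log(tools.map(t => t.name)); // should include exec_command, get_terminal_buffer, ...
// For STDIO mode, the SDK's StdioClientTransport can spawn "npx -y tabby-mcp-stdio --port 3001" instead.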
Configuration in Tabby Settings
Configure the MCP server through the Tabby settings:
{
"mcp": {
"port": 3001,
"host": "http://localhost:3001",
"enableLogging": false,
"startOnBoot": true,
"pairProgrammingMode": {
"enabled": true,
"showConfirmationDialog": true,
"autoFocusTerminal": true
}
}
}
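For reference, the settings above map onto a structure like the one below. This is an informal sketch of the shape shown in the JSON, with each key's meaning taken from this README; it is not the plugin's actual type definition.
// Informal description of the MCP plugin settings (names mirror the JSON keys above)
interface McpPluginSettings {
  port: number;                       // port the MCP server listens on (default 3001)
  host: string;                       // base URL clients use to reach the server
  enableLogging: boolean;             // enable logging of MCP activity
  startOnBoot: boolean;               // start the server automatically when Tabby launches
  pairProgrammingMode: {
    enabled: boolean;                 // master switch for the safety features below
    showConfirmationDialog: boolean;  // ask before executing AI-issued commands
    autoFocusTerminal: boolean;       // focus the target terminal when a command runs
  };
}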
Pair Programming Mode
The plugin includes a "Pair Programming Mode" that adds safety features when AI assistants control your terminal:
- Confirmation Dialog: Prompt user before executing commands
- Auto Focus Terminal: Automatically focus terminal when commands are executed
- Command Rejection: Ability to reject commands with feedback
To enable Pair Programming Mode:
- Go to Tabby settings → Plugins → MCP
- Toggle "Enable Pair Programming Mode"
- Configure additional safety options as needed
API Reference
Available Tools
Tool | Description | Parameters |
---|---|---|
get_ssh_session_list | Get a list of all terminal sessions | None |
exec_command | Execute a command in a terminal | command, tabId, commandExplanation |
get_terminal_buffer | Get terminal buffer content | tabId, startLine, endLine |
get_command_output | Retrieve complete command output | outputId, startLine, maxLines |
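Putting these tools together, a typical session follows the flow sketched below: list sessions, execute a command, then page through the stored output. It continues from the SSE client sketch above; the placeholder IDs and the exact shape of each result are assumptions.
// Example tool-call flow (continues the client from the connection sketch; result shapes are assumptions)
const sessions = await client.callTool({ name: "get_ssh_session_list", arguments: {} });
console.log(sessions); // inspect the available terminal sessions

const run = await client.callTool({
  name: "exec_command",
  arguments: {
    command: "ls -la",
    tabId: "<tab id from the session list>",            // hypothetical placeholder
    commandExplanation: "List files in the current directory",
  },
});
console.log(run);

// Long outputs are stored and paged; outputId is a hypothetical value taken from the exec_command result.
const output = await client.callTool({
  name: "get_command_output",
  arguments: { outputId: "<output id>", startLine: 1, maxLines: 100 },
});
console.log(output);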
Contributing
Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch (git checkout -b feature/your-feature)
- Commit your changes (git commit -m 'Add your feature')
- Push to the branch (git push origin feature/your-feature)
- Open a Pull Request
See the contributing guidelines for more details.
Development Workflow
- Clone the repository and install dependencies:
git clone https://github.com/thuanpham582002/tabby-mcp-server.git
cd tabby-mcp-server
npm install
- Make your changes to the codebase
- Build the plugin:
docker build -t tabby-mcp . && docker run -v $(pwd)/build:/output tabby-mcp
- Test the plugin with Tabby:
bash scripts/copy_to_plugin_folder.sh
License
This project is licensed under the MIT License - see the LICENSE file for details.