# Figma MCP Server

A Model Context Protocol (MCP) server that provides integration with Figma's API, allowing you to interact with Figma files, comments, components, projects, and more.
## Features

- **File Operations**
  - Get file information
  - Get file version history
  - Get file components
- **Comment Management**
  - List comments in files
  - Add new comments
  - Delete comments
- **Project & Team Features**
  - List team projects
  - Get project files
  - Get published styles
- **Webhook Management**
  - Create webhooks
  - List existing webhooks
  - Delete webhooks
## Installation

### Installing via Smithery

To install the Figma MCP Server for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @deepsuthar496/figma-mcp-server --client claude
```

### Manual Installation

1. Clone the repository
2. Install dependencies:
   ```bash
   npm install
   ```
3. Build the server:
   ```bash
   npm run build
   ```
## Configuration

Configure the server in your MCP settings file with your Figma access token:

```json
{
  "mcpServers": {
    "figma": {
      "command": "node",
      "args": ["path/to/figma-server/build/index.js"],
      "env": {
        "FIGMA_ACCESS_TOKEN": "your-access-token-here"
      },
      "disabled": false,
      "alwaysAllow": []
    }
  }
}
```
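The server receives its token through the `FIGMA_ACCESS_TOKEN` environment variable shown above. A minimal sketch of the kind of startup validation a server like this typically performs (the `readToken` helper is hypothetical, not the server's actual code):

```typescript
// Hypothetical sketch: fail fast at startup if the token is missing.
// The actual server's startup code may differ.
function readToken(env: Record<string, string | undefined>): string {
  const token = env.FIGMA_ACCESS_TOKEN;
  if (!token) {
    throw new Error("FIGMA_ACCESS_TOKEN environment variable is required");
  }
  return token;
}

// At startup the server would call something like:
// const token = readToken(process.env);
```

Failing fast here gives a clear error in the MCP client logs instead of opaque 403 responses from Figma later on.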
## Available Tools

### File Operations

#### get_file
Get information about a Figma file.

```json
{
  "file_key": "string"
}
```

#### get_file_versions
Get the version history of a file.

```json
{
  "file_key": "string"
}
```

#### get_file_components
Get the components in a file.

```json
{
  "file_key": "string"
}
```
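These tools correspond to Figma's public REST endpoints (`GET /v1/files/:file_key`, plus the `/versions` and `/components` suffixes). A sketch of how a `file_key` could map to an authenticated request, assuming the standard `X-Figma-Token` header (the helper name is illustrative, not the server's actual code):

```typescript
// Illustrative mapping from tool arguments to Figma REST API requests.
// Endpoint paths follow Figma's public API; `fileRequest` is hypothetical.
const FIGMA_API = "https://api.figma.com/v1";

function fileRequest(
  fileKey: string,
  token: string,
  suffix = ""
): { url: string; headers: Record<string, string> } {
  return {
    url: `${FIGMA_API}/files/${encodeURIComponent(fileKey)}${suffix}`,
    headers: { "X-Figma-Token": token },
  };
}

// get_file            -> fileRequest(key, token)
// get_file_versions   -> fileRequest(key, token, "/versions")
// get_file_components -> fileRequest(key, token, "/components")
```

The `file_key` is the segment of a Figma file URL that follows `/file/` (or `/design/`).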
### Comment Management

#### get_file_comments
Get comments from a file.

```json
{
  "file_key": "string"
}
```

#### post_comment
Post a comment to a file.

```json
{
  "file_key": "string",
  "message": "string"
}
```

#### delete_comment
Delete a comment from a file.

```json
{
  "file_key": "string",
  "comment_id": "string"
}
```
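The comment tools wrap Figma's comments endpoints: `GET`/`POST /v1/files/:file_key/comments` and `DELETE /v1/files/:file_key/comments/:comment_id`. A sketch of the request shapes involved (helper names and the `FigmaRequest` type are hypothetical):

```typescript
// Illustrative request shapes for the comment tools.
// Endpoint paths follow Figma's public comments API; helpers are hypothetical.
interface FigmaRequest {
  method: "GET" | "POST" | "DELETE";
  url: string;
  body?: string;
}

const BASE = "https://api.figma.com/v1";

function postComment(fileKey: string, message: string): FigmaRequest {
  return {
    method: "POST",
    url: `${BASE}/files/${fileKey}/comments`,
    body: JSON.stringify({ message }), // the comment text goes in the JSON body
  };
}

function deleteComment(fileKey: string, commentId: string): FigmaRequest {
  return {
    method: "DELETE",
    url: `${BASE}/files/${fileKey}/comments/${commentId}`,
  };
}
```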
### Project & Team Operations

#### get_team_projects
Get the projects for a team.

```json
{
  "team_id": "string"
}
```

#### get_project_files
Get the files in a project.

```json
{
  "project_id": "string"
}
```

#### get_component_styles
Get the published styles for a team.

```json
{
  "team_id": "string"
}
```
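These three tools map onto Figma's team and project endpoints. A compact sketch of the tool-to-endpoint mapping (the map itself is illustrative, not the server's actual routing code):

```typescript
// Illustrative tool-to-endpoint map. Paths follow Figma's public API;
// this map is an assumption about how the server dispatches requests.
const teamEndpoints: Record<string, (id: string) => string> = {
  get_team_projects: (teamId) => `/v1/teams/${teamId}/projects`,
  get_project_files: (projectId) => `/v1/projects/${projectId}/files`,
  get_component_styles: (teamId) => `/v1/teams/${teamId}/styles`,
};
```

The `team_id` is the numeric segment of a team URL (`figma.com/files/team/:team_id/...`); project IDs come from the `get_team_projects` response.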
### Webhook Management

#### create_webhook
Create a webhook.

```json
{
  "team_id": "string",
  "event_type": "string",
  "callback_url": "string"
}
```

#### get_webhooks
List webhooks.

```json
{
  "team_id": "string"
}
```

#### delete_webhook
Delete a webhook.

```json
{
  "webhook_id": "string"
}
```
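Figma's Webhooks V2 API (`POST /v2/webhooks`) expects the callback URL in an `endpoint` field and also requires a `passcode` that Figma echoes back in each delivery so you can verify the sender. A sketch of how the tool's arguments might map onto that payload (the field mapping and helper are assumptions, not confirmed server behavior):

```typescript
// Illustrative payload for create_webhook, mapped onto Figma's Webhooks V2 API.
// Assumption: the tool's `callback_url` becomes the API's `endpoint` field.
function createWebhookPayload(
  teamId: string,
  eventType: string,
  callbackUrl: string,
  passcode: string
) {
  return {
    event_type: eventType, // e.g. "FILE_UPDATE" or "FILE_COMMENT"
    team_id: teamId,
    endpoint: callbackUrl, // where Figma POSTs event deliveries
    passcode, // echoed back in deliveries for sender verification
  };
}
```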
## Usage Example

Example using the MCP tool to get file information:

```xml
<use_mcp_tool>
  <server_name>figma</server_name>
  <tool_name>get_file</tool_name>
  <arguments>
    {
      "file_key": "your-file-key"
    }
  </arguments>
</use_mcp_tool>
```
## License

MIT

## Contributing

1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Create a new Pull Request