# blender-ai-mcp
A production-shaped MCP server for Blender.
blender-ai-mcp lets Claude, ChatGPT, Codex, and other MCP clients control Blender through a stable tool API instead of ad-hoc Python generation. The result is a safer, smaller, and more reliable surface for real modeling work: goal-first routing, curated public tools, deterministic inspection, and verification that does not depend on guesswork.
## Why This Exists
Most "AI + Blender" setups still ask the model to write raw bpy scripts. That breaks exactly where production work gets interesting:
- Blender APIs drift across versions.
- Context-sensitive operators fail when the active object, mode, or selection is wrong.
- Raw scripts give weak feedback when something goes wrong.
- Vision can describe a result, but it cannot be trusted as the final authority.
blender-ai-mcp takes the opposite approach: treat Blender control as a product surface, not a code-generation stunt.
## Why This MCP Server Instead of Raw Python
- Stable contracts over script synthesis. The model calls tools with validated parameters instead of improvising Blender code.
- Goal-first orchestration. Normal guided sessions start from `router_set_goal(...)`, so the system knows what the model is trying to build before it starts calling low-level actions.
- Small public surface. The default `llm-guided` profile exposes a tiny, search-first bootstrap layer instead of flooding the model with the whole runtime inventory.
- Truth-first verification. Inspection, measurement, and assertion tools determine what is actually true in Blender.
- Safe execution boundaries. The Blender addon executes operations on Blender's main thread while the MCP server handles routing, validation, discovery, and structured responses.
## The Product Approach
The business idea formalized in TASK-113 is simple:
- Atomic tools are the implementation substrate. They stay small, precise, and mostly hidden from the normal public surface.
- Macro tools are the preferred LLM-facing layer for meaningful task-sized work.
- Workflow tools are bounded multi-step process tools with explicit reporting, not open-ended "do anything" endpoints.
- Goal-first orchestration keeps sessions anchored to an active intent instead of making the model rediscover context on every turn.
- Vision assists interpretation, while deterministic measurement and assertions provide the final truth layer.
- Pluggable vision runtimes now cover local MLX plus external OpenRouter and Google AI Studio / Gemini provider paths behind the same bounded contract.
This is what turns the project from "Blender tools exposed over MCP" into a usable AI control product for modeling pipelines.
## LLM-Guided Public Surface
`llm-guided` is the default production-oriented surface. It is intentionally small, search-first, and designed around goal-aware sessions.
Normal guided flow:

1. `router_set_goal(...)`
2. `browse_workflows`, `search_tools`, or `call_tool`
3. use grouped/public tools such as `check_scene`, `inspect_scene`, or `configure_scene`
4. verify with inspection plus `scene_measure_*` and `scene_assert_*`
When a bounded modeling intent matches, the default public working layer should be the macro layer:
- `macro_cutout_recess` for recesses, openings, and cutter-driven cutouts
- `macro_relative_layout` for align/place/contact-gap part layout
- `macro_finish_form` for preset-driven bevel/subdivision/solidify finishing
- `reference_images` for goal-scoped reference intake before bounded visual comparison
- `reference_compare_stage_checkpoint` for deterministic multi-view stage comparison against attached references during manual iterative work
- `reference_iterate_stage_checkpoint` for a session-aware staged correction loop that remembers prior focus, can escalate into inspect/validate when the same correction repeats, and can now target one object, many objects, a collection, or the full assembled silhouette
Current guided bootstrap surface:
- `router_set_goal`
- `router_get_status`
- `browse_workflows`
- `reference_images`
- `search_tools`
- `call_tool`
- `list_prompts`
- `get_prompt`
Current guided utility prep path:
- bootstrap/planning search can now reach `scene_get_viewport` and `scene_clean_scene`
- these utility actions stay bounded and do not reopen the full legacy surface
- build goals should still start from `router_set_goal(...)`, but screenshot / viewport / scene-reset requests should use the guided utility path instead
Current public aliases on `llm-guided`:
| Internal tool | llm-guided public name | Public arg changes |
|---|---|---|
| `scene_context` | `check_scene` | `action` -> `query` |
| `scene_inspect` | `inspect_scene` | `object_name` -> `target_object` |
| `scene_configure` | `configure_scene` | `settings` -> `config` |
| `workflow_catalog` | `browse_workflows` | `workflow_name` -> `name`, `query` -> `search_query` |
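The alias table above amounts to a small translation layer: a public guided call is mapped back to the internal tool and argument names before dispatch. The mapping data comes from this README, but the `translate()` helper itself is hypothetical, a sketch of the idea rather than the server's actual implementation:

```python
# Alias layer sketch: public llm-guided name -> (internal tool,
# {public arg: internal arg}). Data from the README table; translate()
# is illustrative only.
ALIASES = {
    "check_scene": ("scene_context", {"query": "action"}),
    "inspect_scene": ("scene_inspect", {"target_object": "object_name"}),
    "configure_scene": ("scene_configure", {"config": "settings"}),
    "browse_workflows": ("workflow_catalog", {"name": "workflow_name",
                                              "search_query": "query"}),
}

def translate(public_name: str, args: dict) -> tuple[str, dict]:
    """Map a public guided call onto the internal tool and arg names."""
    internal, renames = ALIASES[public_name]
    return internal, {renames.get(k, k): v for k, v in args.items()}

print(translate("inspect_scene", {"target_object": "Cube"}))
```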
Why that matters:
- the guided profile starts from 8 visible tools instead of the full catalog
- grouped/public tools stay easy to discover
- hidden atomic tools remain available as infrastructure, not as the default public mental model
- specialist families stay out of the normal guided entry layer until the macro surface is broader
## Atomic Foundations And Docs
The root `README.md` is intentionally not the full tool catalog anymore.
The detailed tool inventory and atomic family docs should stay in docs, not on the front page. That is the right long-term structure after TASK-113.
Use these docs depending on what you need:
- Tool Layering Policy
  - Canonical policy for atomic / macro / workflow layering, hidden atomic tools, goal-first usage, and vision/assert boundaries.
- MCP Server Docs
  - Surface profiles, guided aliases, versioned contracts, and runtime/platform guidance.
- MCP Client Config Examples
  - Ready-to-paste local MCP client config examples for guided/manual surfaces plus MLX, OpenRouter, and Gemini vision variants.
- Vision Layer Docs
  - Runtime/backends, capture bundles, reference images, macro/workflow vision integration notes, and repo-tracked real viewport eval bundles for both direct user-view and fixed camera-perspective captures.
- Available Tools Summary
  - Full inventory and grouped/public tool overview.
- Tool Architecture Index
  - Maintainer-facing map of the tool families underneath the MCP surface.
If you want to see the atomic families the server is built on, start with the Tool Architecture Index.
Recommended interpretation:

- keep `/_docs/TOOLS/` as the maintainer-facing atomic/grouped architecture map
- keep `README.md` product-facing and compact
- keep `/_docs/AVAILABLE_TOOLS_SUMMARY.md` as the runtime inventory
## Provider Notes
Current short version:
- Local default: `mlx_local` with a Qwen VL 4B-class model path; current repo-validated baseline is `mlx-community/Qwen3-VL-4B-Instruct-4bit`
- External iterative compare candidate: OpenRouter with `x-ai/grok-4.20-multi-agent`
- External experimental path: Google AI Studio / Gemini currently needs a provider-specific structured-output contract for harder staged compare flows
## Architecture
The system is split on purpose:
- MCP server (`server/`): FastMCP surface, public tool definitions, transforms, discovery, and response contracts.
- Router (`server/router/`): goal interpretation, safety/correction policy, workflow matching, session context, and guided execution behavior.
- Blender addon (`blender_addon/`): actual `bpy` execution, RPC handlers, and Blender main-thread-safe operation scheduling.
Communication happens through JSON-RPC over TCP sockets.
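As a rough illustration of that transport, a client-side frame can be built like this. Only the JSON-RPC-over-TCP transport is stated in this README; the method name, parameter values, and newline-delimited framing here are assumptions for the sketch:

```python
import itertools
import json

# Sketch of a JSON-RPC 2.0 request frame for the server <-> addon link.
# Method/params values and the newline framing are illustrative
# assumptions, not the project's documented wire format.
_ids = itertools.count(1)

def build_request(method: str, params: dict) -> bytes:
    """Encode one JSON-RPC request as a newline-terminated byte frame."""
    payload = {"jsonrpc": "2.0", "id": next(_ids),
               "method": method, "params": params}
    return (json.dumps(payload) + "\n").encode("utf-8")

frame = build_request("scene_context", {"action": "summary"})
print(frame.decode("utf-8").strip())
```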
More detail: see the Architecture and MCP Server Docs entries in the Documentation Map below.
## Structured Contract Baseline
The server is moving critical surfaces toward machine-readable payloads instead of prose-heavy JSON strings.
Current structured-contract baseline includes:
- `macro_cutout_recess`
- `macro_finish_form`
- `macro_relative_layout`
- `scene_create`
- `scene_configure`
- `mesh_select`
- `mesh_select_targeted`
- `mesh_inspect`
- `scene_snapshot_state`
- `scene_compare_snapshot`
- `scene_measure_distance`
- `scene_measure_dimensions`
- `scene_measure_gap`
- `scene_measure_alignment`
- `scene_measure_overlap`
- `scene_assert_contact`
- `scene_assert_dimensions`
- `scene_assert_containment`
- `scene_assert_symmetry`
- `scene_assert_proportion`
- `router_set_goal`
- `router_get_status`
- `workflow_catalog`
That is important for automation, auditing, and future macro/workflow composition.
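To make the automation point concrete, a structured payload can be consumed as a typed record instead of parsed out of prose. The README states only that these tools return machine-readable payloads; the field names in this sketch are hypothetical:

```python
from dataclasses import dataclass

# Illustrative only: a typed view over a structured measurement result.
# Field names (status/distance/unit) are hypothetical examples.
@dataclass(frozen=True)
class MeasureDistanceResult:
    status: str
    distance: float
    unit: str

def parse_result(payload: dict) -> MeasureDistanceResult:
    """Validate and type a raw structured payload."""
    return MeasureDistanceResult(
        status=str(payload["status"]),
        distance=float(payload["distance"]),
        unit=str(payload["unit"]),
    )

result = parse_result({"status": "ok", "distance": 2.5, "unit": "m"})
print(result)
```

A client that gets typed records like this can audit, diff, and compose results mechanically, which is the stated goal of the structured-contract baseline.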
## Structured Clarification Flow
The guided surface supports missing-input handling as part of the product contract, not as an afterthought.
- Model-first clarification is the default for `router_set_goal(...)` on `llm-guided`: missing workflow parameters return a typed `needs_input` payload to the outer model first.
- Typed fallback payloads keep the same flow usable on tool-only or compatibility clients.
- Human/native clarification is reserved for later/fallback policy rather than the default first step of workflow execution.
- `router_set_goal(...)` can ask for constrained choices, booleans, enums, or workflow confirmation.
- Partial answers survive across follow-up turns.
- `workflow_catalog` import conflicts reuse the same clarification model.
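A `needs_input` exchange might look like the following. The `needs_input` name is from this README; every other field name and value in the payload is invented for illustration:

```python
# Hypothetical shape of a typed `needs_input` clarification payload from
# router_set_goal(...). Field names other than the needs_input status
# are illustrative assumptions.
response = {
    "status": "needs_input",
    "question": "Which workflow variant?",
    "kind": "choice",
    "choices": ["recess", "through_cut"],
}

def answer_clarification(payload: dict, pick: str) -> dict:
    """Turn a clarification payload into follow-up answer params."""
    assert payload["status"] == "needs_input"
    assert pick in payload["choices"], "answer must be a constrained choice"
    return {"answers": {payload["question"]: pick}}

print(answer_clarification(response, "recess"))
```

Because the payload is typed rather than prose, a tool-only client can answer it mechanically, which is what keeps the fallback path usable.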
## Guided Handoff Contract
The guided surface now treats workflow fallback as an explicit typed contract instead of a phase side effect hidden in prose.
- `router_set_goal(...)` returns `guided_handoff` on bounded continuation paths such as `continuation_mode="guided_manual_build"` and `continuation_mode="guided_utility"`.
- `guided_handoff` names the `target_phase`, `direct_tools`, `supporting_tools`, and `discovery_tools` for the next step on `llm-guided`.
- `workflow_import_recommended` stays `False` on these fallback paths unless the user explicitly asks for workflow import/create behavior.
- `router_get_status(...)` preserves the active `guided_handoff` in session diagnostics so clients can recover the intended continuation path.
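A client consuming that contract might look like this. The field names (`guided_handoff`, `target_phase`, `direct_tools`, `supporting_tools`, `discovery_tools`, `workflow_import_recommended`) come from this README; the concrete values and the `next_tools()` helper are illustrative:

```python
# Sketch of consuming a guided_handoff payload. Field names are from
# the README; values and the helper are hypothetical.
handoff = {
    "continuation_mode": "guided_manual_build",
    "guided_handoff": {
        "target_phase": "blockout",
        "direct_tools": ["configure_scene"],
        "supporting_tools": ["check_scene", "inspect_scene"],
        "discovery_tools": ["search_tools", "call_tool"],
    },
    "workflow_import_recommended": False,
}

def next_tools(payload: dict) -> list[str]:
    """Collect the tools a client should surface for the next step."""
    h = payload["guided_handoff"]
    return h["direct_tools"] + h["supporting_tools"] + h["discovery_tools"]

print(next_tools(handoff))
```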
## Server-Side Sampling Assistants Baseline
The MCP server now has a bounded analytical assistant layer inside an active request.
Current use cases:
- optional `assistant_summary` on inspection-heavy paths such as `scene_snapshot_state`, `scene_compare_snapshot`, `scene_get_hierarchy`, `scene_get_bounding_box`, and `scene_get_origin_info`
- bounded `repair_suggestion` on `router_set_goal`, `router_get_status`, and `workflow_catalog`
Explicit assistant terminal states:
- `success`
- `unavailable`
- `masked_error`
- `rejected_by_policy`
The rule is strict: assistants may help summarize or suggest, but they do not override scene truth or router policy.
## Versioned Surface Baseline
Public surface evolution is versioned explicitly:
| Surface profile | Default contract line |
|---|---|
| `legacy-manual` | `legacy-v1` |
| `legacy-flat` | `legacy-v1` |
| `llm-guided` | `llm-guided-v2` |
Compatibility note:
- `llm-guided-v1` remains selectable as a rollback line
- `workflow_catalog`, `scene_context`, and `scene_inspect` participate in the guided surface evolution story
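The profile-to-contract mapping in the table, together with the rollback note, reduces to a small lookup. The `select_contract()` helper is a hypothetical illustration of that policy, not server code:

```python
# Profile -> default contract line, per the README table; the rollback
# entry reflects the compatibility note. select_contract() is
# illustrative only.
DEFAULT_CONTRACTS = {
    "legacy-manual": "legacy-v1",
    "legacy-flat": "legacy-v1",
    "llm-guided": "llm-guided-v2",
}
ROLLBACK_LINES = {"llm-guided": "llm-guided-v1"}

def select_contract(profile: str, rollback: bool = False) -> str:
    """Pick the contract line for a surface profile."""
    if rollback and profile in ROLLBACK_LINES:
        return ROLLBACK_LINES[profile]
    return DEFAULT_CONTRACTS[profile]

print(select_contract("llm-guided"), select_contract("llm-guided", rollback=True))
```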
## Code Mode Decision
Current benchmark baselines:
- `legacy-flat`
- `llm-guided`
- `code-mode-pilot`
Current decision:
- Go decision: keep `code-mode-pilot` as an experimental read-only surface
- Do not make Code Mode the default path for write-heavy or geometry-destructive Blender work
## Support Matrix
- Blender: tested on Blender 5.0 in E2E coverage; addon minimum remains Blender 4.0+ on a best-effort basis.
- Python: 3.11+
- FastMCP task runtime: fastmcp 3.1.1 + pydocket 0.18.2
- OS: macOS / Windows / Linux
- Memory: router semantic features rely on a local LaBSE model and related vector infrastructure
## Quick Start
1. Install the Blender addon
- Download `blender_ai_mcp.zip` from the Releases page or build it locally with `python scripts/build_addon.py`.
- Open Blender -> Edit -> Preferences -> Add-ons.
- Click Install... and select the zip file.
- Enable the addon. It starts the local Blender RPC server on port `8765`.
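Once the addon is enabled, a quick way to confirm the RPC server is up is a plain TCP connect. This helper is illustrative, not part of the product; it assumes only the default port 8765 from the step above:

```python
import socket

# Optional sanity check: True if something is listening on the addon's
# default RPC port. Illustrative helper, not part of blender-ai-mcp.
def addon_reachable(host: str = "127.0.0.1", port: int = 8765,
                    timeout: float = 1.0) -> bool:
    """Try a TCP connect to the addon's RPC port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("addon reachable:", addon_reachable())
```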
2. Run the MCP server on the guided profile
Recommended defaults:

- `ROUTER_ENABLED=true`
- `MCP_SURFACE_PROFILE=llm-guided`
- map `/tmp` if you want host-visible image/file outputs
Example Docker command:

```shell
docker run -i --rm \
  -v /tmp:/tmp \
  -e BLENDER_AI_TMP_INTERNAL_DIR=/tmp \
  -e BLENDER_AI_TMP_EXTERNAL_DIR=/tmp \
  -e ROUTER_ENABLED=true \
  -e MCP_SURFACE_PROFILE=llm-guided \
  -e BLENDER_RPC_HOST=host.docker.internal \
  ghcr.io/patrykiti/blender-ai-mcp:latest
```
Example generic MCP client config:

```json
{
  "mcpServers": {
    "blender-ai-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-v", "/tmp:/tmp",
        "-e", "BLENDER_AI_TMP_INTERNAL_DIR=/tmp",
        "-e", "BLENDER_AI_TMP_EXTERNAL_DIR=/tmp",
        "-e", "ROUTER_ENABLED=true",
        "-e", "MCP_SURFACE_PROFILE=llm-guided",
        "-e", "BLENDER_RPC_HOST=host.docker.internal",
        "ghcr.io/patrykiti/blender-ai-mcp:latest"
      ]
    }
  }
}
```
Network notes:
- macOS / Windows: use `host.docker.internal`
- Linux: prefer `--network host` with `BLENDER_RPC_HOST=127.0.0.1`
For broader profile/config examples, see the MCP Client Config Examples doc.
## Testing
Unit tests:

```shell
PYTHONPATH=. poetry run pytest tests/unit/ -v
```

Unit collection count:

```shell
poetry run pytest tests/unit --collect-only
```

E2E tests:

```shell
python3 scripts/run_e2e_tests.py
```

E2E collection count:

```shell
poetry run pytest tests/e2e --collect-only
```

Pre-commit:

```shell
poetry run pre-commit install --hook-type pre-commit --hook-type pre-push
poetry run pre-commit run --all-files
```
## Documentation Map
- Architecture
- MCP Server Docs
- Router Docs
- Router Responsibility Boundaries
- Addon Docs
- Available Tools Summary
- Tool Architecture Index
- Prompts
- Tasks
## Contributing
Read `CONTRIBUTING.md` before opening a PR. The repo enforces Clean Architecture boundaries, typed Python, router metadata rules, and pre-commit validation.
## Community And Support
If blender-ai-mcp is useful in your workflow, consider sponsoring its long-term development.
Sponsorship helps fund maintenance, docs, testing, and the higher-level reliability work that makes this repo different from raw Blender code generation: goal-first routing, curated tools, deterministic verification, and production-shaped workflow support.
## Author
Patryk Ciechański
- GitHub: PatrykIti
## License
This project is licensed under the Apache License 2.0.