MCP AI Server for Visual Studio

Visual Studio extension with 41 MCP tools for AI assistants: Roslyn-powered semantic code navigation, symbol search, inheritance, call graphs, safe rename, build/test, and Visual Studio Debugger integration.

Other tools read your files. MCP AI Server understands your code.

Install from Marketplace · VS 2022 · Free · Tools Website

The first and only Visual Studio extension that gives AI assistants access to the C# compiler (Roslyn) and the Visual Studio Debugger through the Model Context Protocol. 41 tools. 13 powered by Roslyn. 19 debugging tools (Preview). Semantic understanding, not text matching.

Install

Install from Visual Studio Marketplace

Or search for "MCP AI Server" in Visual Studio Extensions Manager.

See it in Action

Watch the demo video on YouTube.


Why MCP AI Server?

AI coding tools (Claude Code, Codex CLI, Gemini CLI, OpenCode, Cursor, Copilot, Windsurf...) operate at the filesystem level — they read text, run grep, execute builds. They don't understand your code the way Visual Studio does.

MCP AI Server bridges this gap by exposing IntelliSense-level intelligence and the Visual Studio Debugger as MCP tools. Your AI assistant gets the same semantic understanding that powers F12 (Go to Definition), Shift+F12 (Find All References), safe refactoring — and now runtime debugging.

What becomes possible:

| You ask | Without MCP AI Server | With MCP AI Server |
|---|---|---|
| "Find WhisperFactory" | grep returns 47 matches | Class, Whisper.net, line 16 — one exact answer |
| "Rename ProcessDocument" | sed breaks ProcessDocumentAsync | Roslyn renames 23 call sites safely |
| "What implements IDocumentService?" | Impossible via grep | Full inheritance tree with interfaces |
| "What calls AuthenticateUser?" | Text matches, can't tell direction | Precise call graph: callers + callees |
| "Why is ProcessOrder returning null?" | Reads code, guesses | Sets breakpoint, inspects actual runtime values |

22 Stable Tools

Semantic Navigation (Roslyn-powered)

| Tool | Description |
|---|---|
| FindSymbols | Find classes, methods, properties by name — semantic, not text |
| FindSymbolDefinition | Go to definition (F12 equivalent) |
| FindSymbolUsages | Find all references, compiler-verified (Shift+F12 equivalent) |
| GetSymbolAtLocation | Identify the symbol at a specific line and column |
| GetDocumentOutline | Semantic structure: classes, methods, properties, fields |

Code Understanding (Roslyn-powered)

| Tool | Description |
|---|---|
| GetInheritance | Full type hierarchy: base types, derived types, interfaces |
| GetMethodCallers | Which methods call this method (call graph UP) |
| GetMethodCalls | Which methods this method calls (call graph DOWN) |

Code Analysis (Roslyn-powered)

| Tool | Description |
|---|---|
| GetDiagnostics | Compiler errors & warnings without building — Roslyn background analysis |

Refactoring (Roslyn-powered)

| Tool | Description |
|---|---|
| RenameSymbol | Safe rename across the entire solution — compiler-verified |
| FormatDocument | Visual Studio's native code formatter |

Project & Build

| Tool | Description |
|---|---|
| ExecuteCommand | Build or clean solution/project with structured diagnostics |
| ExecuteAsyncTest | Run tests asynchronously with real-time status |
| GetSolutionTree | Solution and project structure |
| GetProjectReferences | Project dependency graph |
| LoadSolution | Open a .sln/.slnx file — server stays on the same port |
| TranslatePath | Convert paths between Windows and WSL formats |
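The drive-letter convention that TranslatePath converts between can be illustrated with a minimal sketch. This is not the extension's implementation (which runs server-side); it just shows the Windows ↔ WSL `/mnt/<drive>` mapping the tool's output follows:

```python
def windows_to_wsl(path: str) -> str:
    """Map a Windows path like C:\\src\\App.cs to the WSL form /mnt/c/src/App.cs."""
    drive, rest = path[0].lower(), path[2:]
    return f"/mnt/{drive}" + rest.replace("\\", "/")

def wsl_to_windows(path: str) -> str:
    """Inverse mapping: /mnt/c/src/App.cs -> C:\\src\\App.cs."""
    drive = path[5].upper()               # character after "/mnt/"
    rest = path[6:].replace("/", "\\")
    return f"{drive}:{rest}"

print(windows_to_wsl(r"C:\src\App.cs"))     # /mnt/c/src/App.cs
print(wsl_to_windows("/mnt/c/src/App.cs"))  # C:\src\App.cs
```

This matters when your AI agent runs inside WSL but Visual Studio reports Windows paths (or vice versa); the Path Format setting below controls which form the server emits.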

Editor Integration

| Tool | Description |
|---|---|
| GetActiveFile | Current file and cursor position |
| GetSelection / CheckSelection | Read active text selection |
| GetLoggingStatus / SetLogLevel | Extension diagnostics |

19 Debugging Tools (Preview)

Your AI assistant can now debug your .NET code at runtime through the Visual Studio Debugger. Set breakpoints, step through code, inspect variables, attach to Docker containers and WSL processes.

Debug Control (10 tools)

| Tool | Description |
|---|---|
| debug_start | Start debugging (F5). Fire-and-forget |
| debug_stop | Stop debugging session |
| debug_get_mode | Current mode: Design, Running, or Break |
| debug_break | Pause the running application |
| debug_continue | Resume execution |
| debug_step | Step over/into/out |
| immediate_execute | Execute expression with side effects |
| debug_list_transports | List transports (Default, Docker, WSL, SSH...) |
| debug_list_processes | List processes on a transport |
| debug_attach | Attach to a running process |
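Because debug_start is fire-and-forget, an agent typically polls debug_get_mode until the debugger reaches Break before inspecting state. Here is a minimal sketch of that loop; `call_tool` is a stand-in stub for whatever MCP invocation your client exposes (here it simulates a breakpoint hit on the third poll), and the `file`/`line` argument names are illustrative, not taken from the server's schema:

```python
import time

def call_tool(name, **args):
    # Stand-in for an MCP "tools/call" round-trip to the server.
    # Simulates a run that reaches a breakpoint on the 3rd mode poll.
    call_tool.polls = getattr(call_tool, "polls", 0) + (name == "debug_get_mode")
    if name == "debug_get_mode":
        return "Break" if call_tool.polls >= 3 else "Running"
    return "ok"

call_tool("breakpoint_set", file="OrderService.cs", line=42)  # argument names illustrative
call_tool("debug_start")                  # fire-and-forget: returns immediately

while call_tool("debug_get_mode") != "Break":  # poll until a breakpoint hits
    time.sleep(0)                         # a real agent would wait ~1s between polls

locals_ = call_tool("debug_get_locals")   # safe to inspect once mode is Break
print("stopped at breakpoint; locals:", locals_)
```

The same pattern applies after debug_step and debug_continue: issue the command, then poll debug_get_mode until it returns Break again.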

Debug Inspection (5 tools)

| Tool | Description |
|---|---|
| debug_get_callstack | Call stack of current thread |
| debug_get_locals | Local variables (tree-navigable) |
| debug_evaluate | Evaluate expression / drill into variable tree |
| output_read | Read VS Output window (Build, Debug, Tests) |
| error_list_get | Errors and warnings from VS Error List |

Breakpoint Management (4 tools)

| Tool | Description |
|---|---|
| breakpoint_set | Set breakpoint by file+line or function name |
| breakpoint_remove | Remove breakpoint |
| breakpoint_list | List all breakpoints |
| exception_settings_set | Configure break-on-exception |

AI Debugging Guide

Complete reference for AI agents — all 19 tools, 10 workflows, Docker & WSL setup, polling patterns, and best practices.

Download AI Debugging Guide (.md) — add it to your AI's context for full debugging capabilities.


Compatible Clients

Works with any MCP-compatible AI tool:

CLI Agents:

  • Claude Code — Anthropic's terminal AI coding agent
  • Codex CLI — OpenAI's terminal coding agent
  • Gemini CLI — Google's open-source terminal agent
  • OpenCode — Open-source AI coding agent (45k+ GitHub stars)
  • Goose — Block's open-source AI agent
  • Aider — AI pair programming in terminal

Desktop & IDE:

  • Claude Desktop — Anthropic's desktop app
  • Cursor — AI-first code editor
  • Windsurf — Codeium's AI IDE
  • VS Code + Copilot — GitHub Copilot with MCP
  • Cline — VS Code extension
  • Continue — Open-source AI assistant

Any MCP client — open protocol, universal compatibility


Quick Start

  1. Install from Visual Studio Marketplace
  2. Open your .NET solution in Visual Studio
  3. Configure port in MCP Server Settings (default: 3010)
  4. Add to your MCP client:

```json
"vs-mcp": {
  "type": "http",
  "url": "http://localhost:3010/sdk/"
}
```

  5. Start asking your AI semantic questions about your code
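Under the hood, the `/sdk/` endpoint speaks JSON-RPC 2.0 as defined by the Model Context Protocol: the client sends an `initialize` handshake, then invokes tools with `tools/call`. A rough sketch of those two messages (the tool name comes from the tables above; the `"name"` argument inside `arguments` is illustrative, not taken from the server's schema):

```python
import json

MCP_URL = "http://localhost:3010/sdk/"  # default port

# Every MCP session starts with a JSON-RPC "initialize" request.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# After initialization, tools are invoked with "tools/call".
find_symbols = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "FindSymbols", "arguments": {"name": "WhisperFactory"}},
}

print(json.dumps(initialize))
print(json.dumps(find_symbols))
```

In practice your MCP client handles this exchange for you; the sketch only shows why any MCP-compatible tool can talk to the server without extension-specific glue.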

MCP Client Configuration

| AI Tool | Type | Config File |
|---|---|---|
| Claude Code | CLI | CLAUDE.md |
| Claude Desktop | App | claude_desktop_config.json |
| Cursor | IDE | .cursor/rules/*.mdc or .cursorrules |
| Windsurf | IDE | .windsurfrules |
| Cline | VS Code Extension | .clinerules/ directory |
| VS Code + Copilot | IDE | .github/copilot-instructions.md |
| Continue | IDE Extension | .continue/config.json |
| Gemini CLI | CLI | GEMINI.md |
| OpenAI Codex CLI | CLI | AGENTS.md |
| Goose | CLI | .goose/config.yaml |

Any tool supporting Model Context Protocol will work.


Configure AI Preferences (Recommended)

Add these instructions to your project's AI config file (see table above) to ensure your AI automatically prefers MCP tools:

```markdown
## MCP Tools - ALWAYS PREFER

When `mcp__vs-mcp__*` tools are available, ALWAYS use them instead of Grep/Glob/LS:

| Instead of | Use |
|------------|-----|
| `Grep` for symbols | `FindSymbols`, `FindSymbolUsages` |
| `LS` to explore projects | `GetSolutionTree` |
| Reading files to find code | `FindSymbolDefinition` then `Read` |
| Searching for method calls | `GetMethodCallers`, `GetMethodCalls` |

**Why?** MCP tools use Roslyn semantic analysis - 10x faster, 90% fewer tokens.
```

Multi-Solution: Understand Your Dependencies

Need the AI to understand a library you depend on? Clone the source from GitHub, open it in a second Visual Studio — each instance runs its own MCP server on a configurable port.

```text
Your project                          → port 3010
Library source (cloned from GitHub)   → port 3011
Framework source                      → port 3012
```

Your AI connects to all of them. It can trace calls, find usage patterns, and understand inheritance across your code and library code.

MCP client configuration — three options depending on your client:

Clients with native HTTP support (Claude Desktop, Claude Code):

```json
{
  "mcpServers": {
    "vs-mcp": {
      "type": "http",
      "url": "http://localhost:3010/sdk/"
    },
    "vs-mcp-whisper": {
      "type": "http",
      "url": "http://localhost:3011/sdk/"
    }
  }
}
```

Clients without HTTP support (via mcp-remote proxy):

```json
{
  "mcpServers": {
    "vs-mcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3010/sdk/"]
    },
    "vs-mcp-whisper": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3011/sdk/"]
    }
  }
}
```

Codex CLI (TOML):

```toml
[mcp_servers.vs-mcp]
type = "stdio"
command = "npx"
args = ["mcp-remote", "http://localhost:3010/sdk/"]

[mcp_servers.vs-mcp-whisper]
type = "stdio"
command = "npx"
args = ["mcp-remote", "http://localhost:3011/sdk/"]
```

Settings

Tools → MCP Server Settings → VS-MCP Settings

| Setting | Default | Description |
|---|---|---|
| Port | 3010 | Server port (configurable per VS instance) |
| Path Format | WSL | Output as /mnt/c/... or C:\... |
| Tools | All enabled | Enable/disable individual tool groups |

Changes apply on next server start.


vs. Other Approaches

| Approach | Symbol Search | Inheritance | Call Graph | Safe Rename | Debugging |
|---|---|---|---|---|---|
| MCP AI Server (Roslyn + Debugger) | Semantic | Full tree | Callers + Callees | Compiler-verified | Breakpoints + Step + Inspect |
| AI Agent (grep/fs) | Text match | No | No | Text replace | No |
| Other MCP servers | No | No | No | No | No |

Requirements

  • Visual Studio 2022 (17.13+) or Visual Studio 2026
  • Windows (amd64 or arm64)
  • Any MCP-compatible AI tool

Issues & Feature Requests

Found a bug or have an idea? Open an issue!


About

0ics srl — Italian software company specializing in AI-powered development tools. Part of the example4.ai ecosystem.

Built by Ladislav Sopko — 30 years of software development, from assembler to enterprise .NET.


MCP AI Server: Because your AI deserves the same intelligence as your IDE.
