GhidraMCP
ghidraMCP is a Model Context Protocol (MCP) server that allows LLMs to autonomously reverse engineer applications. It exposes numerous tools from core Ghidra functionality to MCP clients.
https://github.com/user-attachments/assets/36080514-f227-44bd-af84-78e29ee1d7f9
Features
MCP Server + Ghidra Plugin
- Decompile and analyze binaries in Ghidra
- Automatically rename methods and data
- List methods, classes, imports, and exports
Installation
Prerequisites
Ghidra
First, download the latest release from this repository. This contains the Ghidra plugin and Python MCP client. Then, you can directly import the plugin into Ghidra.
- Run Ghidra
- Select File -> Install Extensions
- Click the + button
- Select the GhidraMCP-1-2.zip (or your chosen version) from the downloaded release
- Restart Ghidra
- Make sure the GhidraMCPPlugin is enabled in File -> Configure -> Developer
- Optional: Configure the port in Ghidra with Edit -> Tool Options -> GhidraMCP HTTP Server
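Once Ghidra restarts with the plugin enabled, you can optionally sanity-check that its embedded HTTP server is listening. The snippet below is an illustrative Python check only: the /methods path is an assumption, so consult bridge_mcp_ghidra.py for the routes your version actually exposes, and adjust the port if you changed it in Tool Options.

import requests

# Illustrative check only: the /methods route is assumed; see bridge_mcp_ghidra.py
# for the endpoints your GhidraMCP version actually serves. 8080 is the default port.
resp = requests.get("http://127.0.0.1:8080/methods", timeout=5)
print(resp.status_code)
print(resp.text[:500])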
Video Installation Guide:
https://github.com/user-attachments/assets/75f0c176-6da1-48dc-ad96-c182eb4648c3
MCP Clients
Theoretically, any MCP client should work with ghidraMCP. Three examples are given below.
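Besides the GUI clients below, any programmatic client built on the official MCP Python SDK should also be able to launch the bridge over stdio. The following is a minimal sketch, assuming the mcp package is installed; the script path is a placeholder, and the reported tool names depend on your GhidraMCP version.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Placeholder path: point this at your local copy of bridge_mcp_ghidra.py.
    params = StdioServerParameters(
        command="python",
        args=[
            "/ABSOLUTE_PATH_TO/bridge_mcp_ghidra.py",
            "--ghidra-server", "http://127.0.0.1:8080/",
        ],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Tool names depend on the GhidraMCP version in use.
            print([tool.name for tool in tools.tools])

asyncio.run(main())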
Example 1: Claude Desktop
To set up Claude Desktop as a Ghidra MCP client, go to Claude -> Settings -> Developer -> Edit Config -> claude_desktop_config.json and add the following:
{
  "mcpServers": {
    "ghidra": {
      "command": "python",
      "args": [
        "/ABSOLUTE_PATH_TO/bridge_mcp_ghidra.py",
        "--ghidra-server",
        "http://127.0.0.1:8080/"
      ]
    }
  }
}
Alternatively, edit this file directly:
/Users/YOUR_USER/Library/Application Support/Claude/claude_desktop_config.json
The server IP and port are configurable and should be set to point to the target Ghidra instance. If not set, both will default to localhost:8080.
Example 2: Cline
Using GhidraMCP with Cline also requires manually running the MCP server. First, run the following command:
python bridge_mcp_ghidra.py --transport sse --mcp-host 127.0.0.1 --mcp-port 8081 --ghidra-server http://127.0.0.1:8080/
The only required argument is the transport; if the other arguments are unspecified, they default to the values above. Once the MCP server is running, open Cline and select MCP Servers at the top.
Then select Remote Servers and add the following, ensuring that the URL matches the MCP host and port (a connectivity check sketch follows the list):
- Server Name: GhidraMCP
- Server URL:
http://127.0.0.1:8081/sse
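If you want to confirm the SSE endpoint independently of Cline, a minimal sketch using the MCP Python SDK's SSE client (assuming the mcp package is installed) could look like this; the URL must match the --mcp-host and --mcp-port values used above.

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Must match the --mcp-host/--mcp-port passed to bridge_mcp_ghidra.py.
    async with sse_client("http://127.0.0.1:8081/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())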
Example 3: 5ire
Another MCP client that supports multiple models on the backend is 5ire. To set up GhidraMCP, open 5ire and go to Tools -> New and set the following configurations:
- Tool Key: ghidra
- Name: GhidraMCP
- Command:
python /ABSOLUTE_PATH_TO/bridge_mcp_ghidra.py
Building from Source
- Copy the following files from your Ghidra directory to this project's lib/ directory:
  - Ghidra/Features/Base/lib/Base.jar
  - Ghidra/Features/Decompiler/lib/Decompiler.jar
  - Ghidra/Framework/Docking/lib/Docking.jar
  - Ghidra/Framework/Generic/lib/Generic.jar
  - Ghidra/Framework/Project/lib/Project.jar
  - Ghidra/Framework/SoftwareModeling/lib/SoftwareModeling.jar
  - Ghidra/Framework/Utility/lib/Utility.jar
  - Ghidra/Framework/Gui/lib/Gui.jar
- Build with Maven by running:
mvn clean package assembly:single
The generated zip file includes the built Ghidra plugin and its resources. These files are required for Ghidra to recognize the new extension.
- lib/GhidraMCP.jar
- extensions.properties
- Module.manifest