Integrates Zeek network analysis with conversational AI clients. Requires an external Zeek installation.
This repository provides a set of utilities to build a Model Context Protocol (MCP) server that you can integrate with your conversational AI client.
Zeek must be installed and available in your PATH (required by the execzeek tool).

Clone the repository:

git clone https://github.com/Gabbo01/Zeek-MCP
cd Zeek-MCP
It's recommended to use a virtual environment:
python -m venv venv
source venv/bin/activate # Linux/macOS
venv\Scripts\activate # Windows
pip install -r requirements.txt
Note: If you don't have a requirements.txt, install the dependencies directly:

pip install pandas mcp
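As a quick sanity check before starting the server, you can confirm that Zeek is reachable on your PATH and that the Python dependencies import cleanly:

zeek --version
python -c "import pandas, mcp"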
The repository exposes two main MCP tools and a command-line entry point:
python Bridge_Zeek_MCP.py --mcp-host 127.0.0.1 --mcp-port 8081 --transport sse
--mcp-host: Host for the MCP server (default: 127.0.0.1).
--mcp-port: Port for the MCP server (default: 8081).
--transport: Transport protocol, either sse (Server-Sent Events) or stdio.
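For example, to run the bridge over stdio instead of SSE (for clients that launch it as a subprocess, such as Claude Desktop), you can pass the transport flag and rely on the host/port defaults; this is a minimal variant of the command above:

python Bridge_Zeek_MCP.py --transport stdio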
You need an LLM client that supports MCP tool usage and can call the following tools:
execzeek(pcap_path: str) -> str: Runs Zeek against the given PCAP file, generating .log files in the working directory. Returns the generated .log filenames, or "1" on error.
parselogs(logfile: str) -> DataFrame: Parses the specified .log file and returns the parsed content.

You can interact with these endpoints over HTTP (when using the SSE transport) or by embedding the server in an LLM client such as Claude Desktop (setup instructions below).
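For example, when the server is running with the SSE transport on the default host and port, a small Python client built on the official mcp SDK can call both tools. This is a minimal sketch; the /sse endpoint path and the pcaps/example.pcap file name are assumptions to adapt to your setup:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Connect to the bridge's SSE endpoint (matches --mcp-host / --mcp-port).
    async with sse_client("http://127.0.0.1:8081/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Run Zeek on a capture; the tool returns the generated .log names or "1" on error.
            logs = await session.call_tool("execzeek", {"pcap_path": "pcaps/example.pcap"})
            print(logs.content)

            # Parse one of the generated logs, e.g. conn.log.
            conn = await session.call_tool("parselogs", {"logfile": "conn.log"})
            print(conn.content)

if __name__ == "__main__":
    asyncio.run(main())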
To set up Claude Desktop as a Zeek MCP client, go to Claude -> Settings -> Developer -> Edit Config -> claude_desktop_config.json and add the following:
{
  "mcpServers": {
    "Zeek-mcp": {
      "command": "python",
      "args": [
        "/ABSOLUTE_PATH_TO/Bridge_Zeek_MCP.py"
      ]
    }
  }
}
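Claude Desktop launches MCP servers as local subprocesses and talks to them over stdio, so if the bridge does not already default to that transport, a variant of the same configuration that passes the flag explicitly (an assumption to adjust to your setup) would be:

{
  "mcpServers": {
    "Zeek-mcp": {
      "command": "python",
      "args": [
        "/ABSOLUTE_PATH_TO/Bridge_Zeek_MCP.py",
        "--transport",
        "stdio"
      ]
    }
  }
}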
Alternatively, edit this file directly (the macOS location):
/Users/YOUR_USER/Library/Application Support/Claude/claude_desktop_config.json
Another MCP client that supports multiple models on the backend is 5ire. To set up Zeek-MCP, open 5ire, go to Tools -> New, and set the command to:

python /ABSOLUTE_PATH_TO/Bridge_Zeek_MCP.py
Below is an example of MCP tool usage from a Chainlit chatbot client, run against an example pcap file (you can find a few in the pcaps folder). In that case the model used was claude-3.7-sonnet-reasoning-gemma3-12b.
See LICENSE for more information.