Hayhooks
Deploy and serve Haystack pipelines as REST APIs, MCP Tools, and OpenAI-compatible chat completion backends.
Hayhooks makes it easy to deploy and serve Haystack Pipelines and Agents.
With Hayhooks, you can:
- 📦 Deploy your Haystack pipelines and agents as REST APIs with maximum flexibility and minimal boilerplate code.
- 🛠️ Expose your Haystack pipelines and agents over the MCP protocol, making them available as tools in AI dev environments like Cursor or Claude Desktop. Under the hood, Hayhooks runs as an MCP Server, exposing each pipeline and agent as an MCP Tool.
- 💬 Integrate your Haystack pipelines and agents with Open WebUI as OpenAI-compatible chat completion backends with streaming support.
- 🖥️ Embed a Chainlit chat UI directly in Hayhooks with pip install "hayhooks[chainlit]" and hayhooks run --with-chainlit -- a zero-configuration frontend with streaming, pipeline selection, and custom UI widgets.
- 🕹️ Control Hayhooks core API endpoints through chat - deploy, undeploy, list, or run Haystack pipelines and agents by chatting with Claude Desktop, Cursor, or any other MCP client.
Documentation
📚 For detailed guides, examples, and API reference, check out our comprehensive documentation.
Quick Start
1. Install Hayhooks
# Install Hayhooks
pip install hayhooks
2. Start Hayhooks
hayhooks run
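Hayhooks listens on http://localhost:1416 by default. A quick way to confirm the server is up; the /status path and the /docs URL below are assumptions based on Hayhooks being a FastAPI app, so adjust them if you changed the host or port:
hayhooks status                      # CLI check: server status and deployed pipelines
curl http://localhost:1416/status    # the same check over HTTP
# Interactive API docs (FastAPI) are served at http://localhost:1416/docs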
3. Create a simple agent
Create a minimal agent wrapper with streaming chat support and a simple HTTP POST API:
from typing import AsyncGenerator

from haystack.components.agents import Agent
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool
from haystack.components.generators.chat import OpenAIChatGenerator

from hayhooks import BasePipelineWrapper, async_streaming_generator


# Define a Haystack Tool that provides weather information for a given location.
def weather_function(location):
    return f"The weather in {location} is sunny."


weather_tool = Tool(
    name="weather_tool",
    description="Provides weather information for a given location.",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
    function=weather_function,
)


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
            system_prompt="You're a helpful agent",
            tools=[weather_tool],
        )

    # This will create a POST /my_agent/run endpoint.
    # `question` is the input argument and is auto-validated by a Pydantic model.
    async def run_api_async(self, question: str) -> str:
        result = await self.agent.run_async(messages=[ChatMessage.from_user(question)])
        return result["last_message"].text

    # This will create an OpenAI-compatible /chat/completions endpoint.
    async def run_chat_completion_async(
        self, model: str, messages: list[dict], body: dict
    ) -> AsyncGenerator[str, None]:
        chat_messages = [
            ChatMessage.from_openai_dict_format(message) for message in messages
        ]

        return async_streaming_generator(
            pipeline=self.agent,
            pipeline_run_args={
                "messages": chat_messages,
            },
        )
Save as my_agent_dir/pipeline_wrapper.py.
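The deploy command in the next step points at this directory and expects the wrapper file to carry exactly that name, so the layout should be:
my_agent_dir/
└── pipeline_wrapper.py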
4. Deploy it
hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
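Once deployed, the same CLI can inspect and remove pipelines. A quick sketch; verify the exact subcommands with hayhooks --help:
hayhooks status                         # list deployed pipelines
hayhooks pipeline undeploy my_agent     # remove the deployment when you're done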
5. Run it
Call the HTTP POST API (/my_agent/run):
curl -X POST http://localhost:1416/my_agent/run \
-H 'Content-Type: application/json' \
-d '{"question": "What can you do?"}'
Call the OpenAI-compatible chat completion API (streaming enabled):
curl -X POST http://localhost:1416/chat/completions \
-H 'Content-Type: application/json' \
-d '{
"model": "my_agent",
"messages": [{"role": "user", "content": "What can you do?"}]
}'
Or chat with it in the embedded Chainlit UI (hayhooks run --with-chainlit) or integrate it with Open WebUI!
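Because the endpoint is OpenAI-compatible, any OpenAI client can talk to it as well. Below is a minimal streaming sketch with the openai Python SDK; the base_url and the placeholder api_key are assumptions (Hayhooks listens on localhost:1416 by default and no key is required out of the box):
from openai import OpenAI

# Point the client at Hayhooks instead of api.openai.com.
client = OpenAI(base_url="http://localhost:1416", api_key="unused")

stream = client.chat.completions.create(
    model="my_agent",  # the name the pipeline was deployed under
    messages=[{"role": "user", "content": "What can you do?"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)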
Key Features
🚀 Easy Deployment
- Deploy Haystack pipelines and agents as REST APIs with minimal setup
- Support for both YAML-based and wrapper-based pipeline deployment
- Automatic OpenAI-compatible endpoint generation
🌐 Multiple Integration Options
- MCP Protocol: Expose pipelines as MCP tools for use in AI development environments (a client configuration sketch follows this list)
- Chainlit UI: Embedded chat frontend with streaming, pipeline selection, and custom UI widgets
- Open WebUI Integration: Use Hayhooks as a backend for Open WebUI with streaming support
- OpenAI Compatibility: Seamless integration with OpenAI-compatible tools and frameworks
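As a concrete example of the MCP integration, here is a sketch of a Claude Desktop-style client configuration that reaches Hayhooks through the mcp-remote bridge. The MCP server URL, port, and path are assumptions; see the MCP section of the documentation for how to start the MCP server and the exact endpoint it exposes:
{
  "mcpServers": {
    "hayhooks": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:1417/mcp"]
    }
  }
}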
🔧 Developer Friendly
- CLI for easy pipeline management
- Flexible configuration options
- Comprehensive logging and debugging support
- Custom route and middleware support
📁 File Upload Support
- Built-in support for handling file uploads in pipelines (a minimal wrapper sketch follows this list)
- Perfect for RAG systems and document processing
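A minimal sketch of a wrapper whose run_api accepts uploaded files; the files parameter convention (a list of FastAPI UploadFile objects) is assumed from the file-upload documentation, and the method body is purely illustrative:
from typing import Optional

from fastapi import UploadFile

from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Build your indexing / RAG pipeline here.
        pass

    def run_api(self, files: Optional[list[UploadFile]] = None) -> str:
        # Echo the received filenames; a real wrapper would feed the file
        # contents into an indexing or RAG pipeline.
        names = [f.filename for f in files or []]
        return f"Received {len(names)} file(s): {', '.join(names)}"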
Next Steps
- Quick Start Guide - Get started with Hayhooks
- Installation - Install Hayhooks and dependencies
- Configuration - Configure Hayhooks for your needs
- Examples - Explore example implementations
Community & Support
- GitHub: deepset-ai/hayhooks
- Issues: GitHub Issues
- Documentation: Full Documentation
Hayhooks is actively maintained by the deepset team.