Hayhooks
Deploy and serve Haystack pipelines as REST APIs, MCP Tools, and OpenAI-compatible chat completion backends.
Hayhooks makes it easy to deploy and serve Haystack Pipelines and Agents.
With Hayhooks, you can:
- đĻ Deploy your Haystack pipelines and agents as REST APIs with maximum flexibility and minimal boilerplate code.
- đ ī¸ Expose your Haystack pipelines and agents over the MCP protocol, making them available as tools in AI dev environments like Cursor or Claude Desktop. Under the hood, Hayhooks runs as an MCP Server, exposing each pipeline and agent as an MCP Tool (see the client configuration sketch after this list).
- đŦ Integrate your Haystack pipelines and agents with Open WebUI as OpenAI-compatible chat completion backends with streaming support.
- đšī¸ Control Hayhooks core API endpoints through chat - deploy, undeploy, list, or run Haystack pipelines and agents by chatting with Claude Desktop, Cursor, or any other MCP client.
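As a concrete sketch of the MCP integration: MCP clients that only speak stdio (Claude Desktop, for example) can bridge to Hayhooks' HTTP-based MCP server through a relay such as the third-party mcp-remote package. The entry below is hypothetical: it assumes Hayhooks' MCP server listens on localhost:1417 and exposes an /sse endpoint, so check your Hayhooks configuration before using it.

```json
{
  "mcpServers": {
    "hayhooks": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:1417/sse"]
    }
  }
}
```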
Documentation
đ For detailed guides, examples, and API reference, check out our comprehensive documentation.
Quick Start
1. Install Hayhooks
```bash
# Install Hayhooks
pip install hayhooks
```
2. Start Hayhooks
```bash
hayhooks run
```

This starts the Hayhooks server, which listens on http://localhost:1416 by default.
3. Create a simple agent
Create a minimal agent wrapper with streaming chat support and a simple HTTP POST API:
```python
from typing import AsyncGenerator

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool

from hayhooks import BasePipelineWrapper, async_streaming_generator


# Define a Haystack Tool that provides weather information for a given location.
def weather_function(location):
    return f"The weather in {location} is sunny."


weather_tool = Tool(
    name="weather_tool",
    description="Provides weather information for a given location.",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
    function=weather_function,
)


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
            system_prompt="You're a helpful agent",
            tools=[weather_tool],
        )

    # This will create a POST /my_agent/run endpoint.
    # `question` will be the input argument and will be auto-validated by a Pydantic model.
    async def run_api_async(self, question: str) -> str:
        result = await self.agent.run_async(messages=[ChatMessage.from_user(question)])
        return result["messages"][-1].text

    # This will create an OpenAI-compatible /chat/completions endpoint with streaming.
    async def run_chat_completion_async(
        self, model: str, messages: list[dict], body: dict
    ) -> AsyncGenerator[str, None]:
        chat_messages = [
            ChatMessage.from_openai_dict_format(message) for message in messages
        ]
        return async_streaming_generator(
            pipeline=self.agent,
            pipeline_run_args={
                "messages": chat_messages,
            },
        )
```
Save it as `my_agent_dir/pipeline_wrapper.py`.
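The deploy command in the next step takes this directory as its argument:

```
my_agent_dir/
└── pipeline_wrapper.py
```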
4. Deploy it
```bash
hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
```
5. Run it
Call the HTTP POST API (`/my_agent/run`):

```bash
curl -X POST http://localhost:1416/my_agent/run \
  -H 'Content-Type: application/json' \
  -d '{"question": "What can you do?"}'
```
Call the OpenAI-compatible chat completion API (streaming enabled):

```bash
curl -X POST http://localhost:1416/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "my_agent",
    "messages": [{"role": "user", "content": "What can you do?"}]
  }'
```
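Since the endpoint is OpenAI-compatible, any OpenAI client SDK should work against it. Here is a minimal sketch using the official openai Python package, assuming Hayhooks runs on localhost:1416 and enforces no API key:

```python
from openai import OpenAI

# Point the client at Hayhooks; the API key is a placeholder,
# since no key is required here (an assumption to verify).
client = OpenAI(base_url="http://localhost:1416", api_key="unused")

stream = client.chat.completions.create(
    model="my_agent",  # the name the wrapper was deployed under
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    stream=True,
)

# Print streamed tokens as they arrive.
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```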
Or integrate it with Open WebUI and start chatting with it!
Key Features
đ Easy Deployment
- Deploy Haystack pipelines and agents as REST APIs with minimal setup
- Support for both YAML-based and wrapper-based pipeline deployment (see the CLI sketch after this list)
- Automatic OpenAI-compatible endpoint generation
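For YAML-based deployment, a hedged sketch; the subcommand and flags here are assumptions, so verify them against `hayhooks --help` for your installed version:

```bash
# Deploy a pipeline serialized to YAML (command shape assumed,
# not confirmed by this README).
hayhooks pipeline deploy -n my_yaml_pipeline ./pipeline.yml
```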
đ Multiple Integration Options
- MCP Protocol: Expose pipelines as MCP tools for use in AI development environments
- Open WebUI Integration: Use Hayhooks as a backend for Open WebUI with streaming support
- OpenAI Compatibility: Seamless integration with OpenAI-compatible tools and frameworks
đ§ Developer Friendly
- CLI for easy pipeline management
- Flexible configuration options
- Comprehensive logging and debugging support
- Custom route and middleware support (see the sketch after this list)
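For custom routes, one approach is to run Hayhooks programmatically and extend the FastAPI app it builds. A minimal sketch, assuming the hayhooks package exposes a create_app factory (check the documentation); the /status route is purely illustrative:

```python
import uvicorn
from hayhooks import create_app  # assumed import; see the Hayhooks docs

# Build the standard Hayhooks FastAPI app, then extend it.
app = create_app()

# A custom route added alongside the generated pipeline endpoints.
# The route name and payload are illustrative only.
@app.get("/status")
async def status():
    return {"status": "ok"}

if __name__ == "__main__":
    uvicorn.run(app, host="localhost", port=1416)
```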
đ File Upload Support
- Built-in support for handling file uploads in pipelines (see the sketch below)
- Perfect for RAG systems and document processing
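A minimal sketch of a file-handling wrapper, assuming uploads arrive as an optional list of FastAPI UploadFile objects on run_api (the parameter name and handling follow the documented pattern, but verify against the docs):

```python
from typing import List, Optional

from fastapi import UploadFile

from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Build your indexing/RAG pipeline here.
        ...

    # Declaring a `files` parameter lets the generated endpoint accept
    # multipart/form-data uploads alongside other inputs (assumed behavior).
    def run_api(self, files: Optional[List[UploadFile]] = None) -> str:
        if files:
            names = [f.filename for f in files if f.filename]
            return f"Received files: {', '.join(names)}"
        return "No files received"
```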
Next Steps
- Quick Start Guide - Get started with Hayhooks
- Installation - Install Hayhooks and dependencies
- Configuration - Configure Hayhooks for your needs
- Examples - Explore example implementations
Community & Support
- GitHub: deepset-ai/hayhooks
- Issues: GitHub Issues
- Documentation: Full Documentation
Hayhooks is actively maintained by the deepset team.