PyAirbyte MCP Server
An AI-powered server that generates PyAirbyte pipeline code and instructions using OpenAI and connector documentation.
What is the PyAirbyte MCP Service?
The PyAirbyte Managed Code Provider (MCP) service is an AI-powered backend that generates PyAirbyte pipeline code and instructions. It leverages OpenAI and connector documentation to help users quickly scaffold and configure data pipelines between sources and destinations supported by Airbyte. The MCP service automates code generation, provides context-aware guidance, and streamlines the process of building and deploying data pipelines. To learn more about how the service works, check out this video.
- Generates PyAirbyte pipeline code based on user instructions and connector documentation.
- Uses OpenAI and file search to provide context-aware code and instructions.
- Available as a remote MCP server for Cursor.
Quick Start
For Cursor
The easiest way to get started is using our hosted MCP server. Add this to your Cursor MCP configuration file (.cursor/mcp.json):
{
  "mcpServers": {
    "pyairbyte-mcp": {
      "url": "https://pyairbyte-mcp-7b7b8566f2ce.herokuapp.com/mcp",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here"
      }
    }
  }
}
Requirements:
- Your own OpenAI API key
- No local installation required
- Works immediately after configuration
Configuration Steps:
- Get your OpenAI API key from the OpenAI Platform
- Create or edit .cursor/mcp.json in your project directory (for project-specific access) or ~/.cursor/mcp.json (for global access)
- Add the configuration above with your actual OpenAI API key
- Toggle the MCP server off and on so the new configuration is picked up
- Start generating PyAirbyte pipelines!
Security Note
- API keys are provided via MCP environment variables in the configuration
- This ensures secure API key handling through the MCP protocol
- Cursor is currently the only client that appears to support passing environment variables to remote servers. We will add Cline support as soon as it is available.
Usage
Once configured, you can use the MCP server in your AI assistant by asking it to generate PyAirbyte pipelines.
🚀 How to Use in Cline
1. Verify Connection
- Look for the MCP server status in Cline's interface
- You should see "pyairbyte-mcp" listed with 1 tool available
- If it shows 0 tools or appears red, check your mcp.json configuration. If you need more help, please ask in this slack channel.
2. Generate Pipelines with Natural Language
Simply ask Cline to generate a PyAirbyte pipeline! Here are example prompts:
Basic Examples:
Generate a PyAirbyte pipeline from source-postgres to destination-snowflake
Create a pipeline to move data from source-github to dataframe
Build a PyAirbyte script for source-stripe to destination-bigquery
Generate a data pipeline from source-salesforce to destination-postgres
Create a pipeline that reads from source-github to a dataframe, and then visualize the results using Streamlit
Help me set up a data pipeline from source-salesforce to destination-postgres
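A prompt like "Create a pipeline to move data from source-github to dataframe" typically yields a short script built on the `airbyte` Python package. The sketch below shows the general shape of such a pipeline; the repository list, selected stream, and `GITHUB_TOKEN` environment variable are hypothetical placeholders, and the actual generated script documents the real configuration fields for your connector:

```python
"""Sketch of a generated source-github -> dataframe pipeline.

Config values (repository list, GITHUB_TOKEN) are placeholders,
not the definitive output of the MCP server.
"""
import os


def run_pipeline():
    import airbyte as ab  # pip install airbyte

    source = ab.get_source(
        "source-github",
        config={
            "repositories": ["airbytehq/quickstarts"],  # placeholder repo
            "credentials": {
                "personal_access_token": os.environ["GITHUB_TOKEN"],
            },
        },
        install_if_missing=True,  # installs the connector on first use
    )
    source.check()                     # validate config and credentials
    source.select_streams(["issues"])  # sync only the streams you need
    result = source.read()             # reads into the default local cache
    return result["issues"].to_pandas()


if __name__ == "__main__" and os.environ.get("GITHUB_TOKEN"):
    print(run_pipeline().head())
```

With `dataframe` as the destination, no destination connector is installed at all; the read result is handed to you as Pandas objects for analysis.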
3. Available Source/Destination Options
- Sources: Any Airbyte source connector (e.g., source-postgres, source-github, source-stripe, source-mysql, source-salesforce)
- Destinations: Any Airbyte destination connector (e.g., destination-snowflake, destination-bigquery, destination-postgres) OR dataframe for Pandas analysis
4. Pro Tips
- Use "dataframe" as destination if you want to analyze data in Python/Pandas
- Be specific about your source and destination names (use official Airbyte connector names and use source- or destination- to specify)
- Ask follow-up questions if you need help with specific configuration or setup
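If you are unsure whether a name follows the official convention, a small helper like this (a hypothetical sketch for your own prompts, not part of the MCP server) can normalize it before you use it:

```python
def normalize_connector_name(name: str, kind: str) -> str:
    """Normalize a connector name to Airbyte's official form.

    kind is "source" or "destination"; "dataframe" passes through
    unchanged because it is a special destination value.
    """
    if kind not in ("source", "destination"):
        raise ValueError(f"unknown connector kind: {kind!r}")
    cleaned = name.strip().lower().replace("_", "-")
    if kind == "destination" and cleaned == "dataframe":
        return cleaned
    prefix = f"{kind}-"
    return cleaned if cleaned.startswith(prefix) else prefix + cleaned


print(normalize_connector_name("Postgres", "source"))      # source-postgres
print(normalize_connector_name("dataframe", "destination"))  # dataframe
```

Whatever the helper produces, still check the name against the Airbyte Connector Registry, since only registered connector names will resolve.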
The tool will automatically use your OpenAI API key (configured in the MCP settings) to generate enhanced, well-documented pipeline code with best practices and detailed setup instructions!
Just start by asking Cline to generate a pipeline for your specific use case! 🎯
Features
- Automated Code Generation: Creates complete PyAirbyte pipeline scripts
- Configuration Management: Handles environment variables and credentials securely
- Documentation Integration: Uses OpenAI to provide context-aware instructions
- Multiple Output Formats: Supports both destination connectors and DataFrame output
- Best Practices: Includes error handling, logging, and proper project structure
- Generate pipeline for over 600 connectors: If it is in the Airbyte Connector Registry, the MCP server can create it.
Available Tools
generate_pyairbyte_pipeline
Generates a complete PyAirbyte pipeline with setup instructions.
Parameters:
- source_name: The official Airbyte source connector name (e.g., 'source-postgres', 'source-github')
- destination_name: The official Airbyte destination connector name (e.g., 'destination-postgres', 'destination-snowflake') OR 'dataframe' to output to Pandas DataFrames
Returns:
- Complete Python pipeline code
- Setup and installation instructions
- Environment variable templates
- Best practices and usage guidelines
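Under the hood, your AI client invokes this tool with a standard MCP tools/call request. The payload below is illustrative (the id and argument values are made up); your MCP client constructs and sends this for you:

```python
import json

# Illustrative JSON-RPC payload for calling the tool over MCP.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "generate_pyairbyte_pipeline",
        "arguments": {
            "source_name": "source-postgres",
            "destination_name": "destination-snowflake",
        },
    },
}
print(json.dumps(request, indent=2))
```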