An AI-powered server that generates PyAirbyte pipeline code and instructions using OpenAI and connector documentation.
The PyAirbyte MCP (Model Context Protocol) service is an AI-powered backend that generates PyAirbyte pipeline code and instructions. It leverages OpenAI and connector documentation to help users quickly scaffold and configure data pipelines between sources and destinations supported by Airbyte. The MCP service automates code generation, provides context-aware guidance, and streamlines the process of building and deploying data pipelines. If you want to learn more about how the service works, check out this video.
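Under the hood, the flow is straightforward: the server combines the requested connector names with relevant connector documentation and asks OpenAI to draft the pipeline code. The snippet below is a hypothetical sketch of that flow, not the service's actual implementation; the function name, prompt wording, and model choice are all illustrative.

from openai import OpenAI

def generate_pipeline(source_name: str, destination_name: str, connector_docs: str) -> str:
    """Hypothetical sketch: ask an LLM to scaffold a PyAirbyte pipeline, grounded in connector docs."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    prompt = (
        f"Write a PyAirbyte pipeline that moves data from {source_name} "
        f"to {destination_name}. Follow best practices and include setup "
        f"instructions.\nConnector documentation:\n{connector_docs}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content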
The easiest way to get started is to use our hosted MCP server. Add this to your Cursor MCP configuration file (.cursor/mcp.json):
{
  "mcpServers": {
    "pyairbyte-mcp": {
      "url": "https://pyairbyte-mcp-7b7b8566f2ce.herokuapp.com/mcp",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here"
      }
    }
  }
}
Requirements: an OpenAI API key (supplied as OPENAI_API_KEY in the MCP configuration above).
Configuration Steps: create .cursor/mcp.json in your project directory (for project-specific access) or ~/.cursor/mcp.json (for global access).
Once configured, you can use the MCP server in your AI assistant by asking it to generate PyAirbyte pipelines.
Simply ask Cline to generate a PyAirbyte pipeline! Here are example prompts:
Basic Examples:
Generate a PyAirbyte pipeline from source-postgres to destination-snowflake
Create a pipeline to move data from source-github to dataframe
Build a PyAirbyte script for source-stripe to destination-bigquery
Generate a data pipeline from source-salesforce to destination-postgres
Create a pipeline that reads from source-github to a dataframe, and then visualize the results using Streamlit
Help me set up a data pipeline from source-salesforce to destination-postgres
Supported connectors:
Sources: official Airbyte source connector names (e.g., source-postgres, source-github, source-stripe, source-mysql, source-salesforce)
Destinations: official Airbyte destination connector names (e.g., destination-snowflake, destination-bigquery, destination-postgres) OR dataframe for Pandas analysis
The tool will automatically use your OpenAI API key (configured in the MCP settings) to generate enhanced, well-documented pipeline code with best practices and detailed setup instructions!
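For orientation, here is a minimal sketch of the kind of code the service generates for the dataframe case, written against PyAirbyte's public API. The connector choice and config values are placeholders, not the service's exact output.

import airbyte as ab

# Install (if needed) and configure the source connector; config values are placeholders.
source = ab.get_source(
    "source-github",
    config={
        "repositories": ["airbytehq/PyAirbyte"],
        "credentials": {"personal_access_token": "<your-github-token>"},
    },
    install_if_missing=True,
)

source.check()                     # validate the connection and credentials
source.select_streams(["issues"])  # choose which streams to sync
result = source.read()             # read records into PyAirbyte's local cache

df = result["issues"].to_pandas()  # materialize a stream as a Pandas DataFrame
print(df.head())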
Just start by asking Cline to generate a pipeline for your specific use case! 🎯
The server's pipeline-generation tool generates a complete PyAirbyte pipeline with setup instructions.
Parameters:
source_name: The official Airbyte source connector name (e.g., 'source-postgres', 'source-github').
destination_name: The official Airbyte destination connector name (e.g., 'destination-postgres', 'destination-snowflake') OR 'dataframe' to output to Pandas DataFrames.
Returns: The generated pipeline code and detailed setup instructions.
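To make destination_name concrete: when a real destination such as destination-snowflake is requested, the generated script typically loads the selected streams into that system rather than into a DataFrame. The sketch below shows one plausible shape for such a pipeline using PyAirbyte's Snowflake cache; all connection values are placeholders, and the exact structure of the generated code may differ.

import airbyte as ab
from airbyte.caches import SnowflakeCache

# Configure the source; all config values are placeholders.
source = ab.get_source(
    "source-postgres",
    config={
        "host": "<host>", "port": 5432, "database": "<db>",
        "username": "<user>", "password": "<password>",
    },
    install_if_missing=True,
)
source.check()
source.select_all_streams()

# Point PyAirbyte at Snowflake; field names follow SnowflakeCache's settings.
cache = SnowflakeCache(
    account="<account>", username="<user>", password="<password>",
    database="<db>", warehouse="<warehouse>", role="<role>",
)
source.read(cache=cache)  # writes all selected streams into Snowflake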