setting-up-astro-project by astronomer

Initialize and configure Astro/Airflow projects with dependencies, connections, and environment setup.

  • Scaffolds the complete project structure with astro dev init, including directories for DAGs, plugins, tests, and configuration files
  • Manages Python and OS-level dependencies via requirements.txt and packages.txt, with custom Dockerfile support for complex setups
  • Configures connections, variables, and pools declaratively in airflow_settings.yaml, with export/import commands for environment...

npx skills add https://github.com/astronomer/agents --skill setting-up-astro-project

Astro Project Setup

This skill helps you initialize and configure Airflow projects using the Astro CLI.

To run the local environment, see the managing-astro-local-env skill. To write DAGs, see the authoring-dags skill. For deployment strategies, use the deploying-airflow skill.

Open-source alternative: if the user isn't on Astro, guide them to Apache Airflow's Docker Compose quickstart for local development and the Helm chart for production (sketch below).
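For the open-source path, the Docker Compose quickstart boils down to a few commands. This is a sketch: the quickstart page pins an exact, version-specific docker-compose.yaml URL, so prefer the commands it shows.

curl -LfO 'https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml'
mkdir -p ./dags ./logs ./plugins ./config
echo "AIRFLOW_UID=$(id -u)" > .env
docker compose up airflow-init   # one-time metadata DB migration and user creation
docker compose up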


Initialize a New Project

astro dev init

Don't pass --airflow-version or --runtime-version unless the user explicitly asks for a specific pin. Plain astro dev init resolves to the latest Astro Runtime — that's the right default. Specifying a version risks pinning to a stale value from training data. If the user wants to know what was installed, read the generated Dockerfile afterward instead of guessing.
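For example, to confirm which Runtime version was actually installed:

astro dev init
grep '^FROM' Dockerfile   # e.g. FROM quay.io/astronomer/astro-runtime:<version>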

Creates this structure:

project/
├── dags/                # DAG files
├── include/             # SQL, configs, supporting files
├── plugins/             # Custom Airflow plugins
├── tests/               # Unit tests
├── Dockerfile           # Image customization
├── packages.txt         # OS-level packages
├── requirements.txt     # Python packages
└── airflow_settings.yaml # Connections, variables, pools

Adding Dependencies

Python Packages (requirements.txt)

apache-airflow-providers-snowflake==5.3.0
pandas==2.1.0
requests>=2.28.0

OS Packages (packages.txt)

gcc
libpq-dev
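
These two files work together: gcc and libpq-dev are the kind of OS packages needed when a Python dependency compiles C extensions at install time. For example (illustrative), a source install of psycopg2 builds against libpq:

psycopg2==2.9.9   # requirements.txt entry that needs gcc + libpq-dev above to build from source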

Custom Dockerfile

For complex setups (private PyPI, custom scripts):

FROM quay.io/astronomer/astro-runtime:12.4.0

RUN pip install --extra-index-url https://pypi.example.com/simple my-package
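
A slightly fuller sketch (the setup script and package index URL are hypothetical; match the FROM line to whatever astro dev init generated):

FROM quay.io/astronomer/astro-runtime:12.4.0

# Hypothetical private index; pass credentials via build args or secrets, never bake them into the image
RUN pip install --extra-index-url https://pypi.example.com/simple my-package

# Hypothetical setup script shipped in include/
COPY include/setup.sh /tmp/setup.sh
RUN chmod +x /tmp/setup.sh && /tmp/setup.sh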

After modifying dependencies, run astro dev restart to rebuild the image.


Configuring Connections & Variables

airflow_settings.yaml

Loaded automatically on environment start:

airflow:
  connections:
    - conn_id: my_postgres
      conn_type: postgres
      conn_host: host.docker.internal
      conn_port: 5432
      conn_login: user
      conn_password: pass
      conn_schema: mydb

  variables:
    - variable_name: env
      variable_value: dev

  pools:
    - pool_name: limited_pool
      pool_slot: 5
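
Provider-specific settings go in conn_extra as a nested map under a connection entry (field name from the Astro CLI template; the AWS keys below are illustrative):

    - conn_id: my_aws
      conn_type: aws
      conn_extra:
        region_name: us-east-1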

Export/Import

# Export from running environment
astro dev object export --connections --file connections.yaml

# Import to environment
astro dev object import --connections --file connections.yaml
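
The same pattern applies to variables and pools (flag names assumed parallel to the connection flags above; verify with astro dev object export --help):

astro dev object export --variables --file variables.yaml
astro dev object export --pools --file pools.yaml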

Validate Before Running

Parse DAGs to catch errors without starting the full environment:

astro dev parse
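
To go beyond a parse check, astro dev pytest runs the project's tests/ directory inside the Runtime image:

astro dev pytest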

Related Skills

  • managing-astro-local-env: Start, stop, and troubleshoot the local environment
  • authoring-dags: Write and validate DAGs (uses MCP tools)
  • testing-dags: Test DAGs (uses MCP tools)
  • deploying-airflow: Deploy DAGs to production (Astro, Docker Compose, Kubernetes)
