authoring-dags by astronomer

Guided workflow for creating Apache Airflow DAGs with validation and testing integration. Structured six-phase approach: discover the environment and existing patterns, plan the DAG structure, implement following best practices, validate with af CLI commands, test with user consent, and iterate on fixes. CLI commands for discovery (af config connections, af config providers, af dags list) and validation (af dags errors, af dags get, af dags explore) provide immediate feedback on DAG...

npx skills add https://github.com/astronomer/agents --skill authoring-dags

DAG Authoring Skill

This skill guides you through creating and validating Airflow DAGs using best practices and af CLI commands.

For testing and debugging DAGs, see the testing-dags skill which covers the full test -> debug -> fix -> retest workflow.


Running the CLI

These commands assume af is on PATH. Run via astro otto to get it automatically, or install standalone with uv tool install astro-airflow-mcp.


Workflow Overview

+-----------------------------------------+
| 1. DISCOVER                             |
|    Understand codebase & environment    |
+-----------------------------------------+
                 |
+-----------------------------------------+
| 2. PLAN                                 |
|    Propose structure, get approval      |
+-----------------------------------------+
                 |
+-----------------------------------------+
| 3. IMPLEMENT                            |
|    Write DAG following patterns         |
+-----------------------------------------+
                 |
+-----------------------------------------+
| 4. VALIDATE                             |
|    Check import errors, warnings        |
+-----------------------------------------+
                 |
+-----------------------------------------+
| 5. TEST (with user consent)             |
|    Trigger, monitor, check logs         |
+-----------------------------------------+
                 |
+-----------------------------------------+
| 6. ITERATE                              |
|    Fix issues, re-validate              |
+-----------------------------------------+

Phase 1: Discover

Before writing code, understand the context.

Explore the Codebase

Use file tools to find existing patterns:

  • Glob for **/dags/**/*.py to find existing DAGs (see the sketch after this list)
  • Read similar DAGs to understand conventions
  • Check requirements.txt for available packages
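
The glob from the first bullet can also be run as a quick Python sketch (illustrative only; it assumes the project root is the current working directory):

# Minimal sketch: list candidate DAG files under any dags/ directory.
from pathlib import Path

dag_files = sorted(Path(".").glob("**/dags/**/*.py"))
for path in dag_files:
    print(path)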

Query the Airflow Environment

Use af CLI commands to understand what's available:

Command                  Purpose
af config connections    What external systems are configured
af config variables      What configuration values exist
af config providers      What operator packages are installed
af config version        Version constraints and features
af dags list             Existing DAGs and naming conventions
af config pools          Resource pools for concurrency

Example discovery questions:

  • "Is there a Snowflake connection?" -> af config connections
  • "What Airflow version?" -> af config version
  • "Are S3 operators available?" -> af config providers

Phase 2: Plan

Based on discovery, propose:

  1. DAG structure - Tasks, dependencies, schedule
  2. Operators to use - Based on available providers
  3. Connections needed - Existing or to be created
  4. Variables needed - Existing or to be created
  5. Packages needed - Additions to requirements.txt

Get user approval before implementing.


Phase 3: Implement

Write the DAG following best practices (see below). Key steps:

  1. Create DAG file in appropriate location
  2. Update requirements.txt if needed
  3. Save the file
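
For step 1, a minimal TaskFlow-style sketch is shown below; the dag_id, schedule, and task bodies are hypothetical placeholders, so adapt them to the conventions, connections, and providers found during discovery:

# Hypothetical example DAG -- adjust ids, schedule, and tags to project conventions.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_report",  # placeholder; follow existing naming conventions
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
    tags=["example"],
)
def example_daily_report():
    @task
    def extract() -> list[int]:
        # Placeholder; swap in a provider hook or operator discovered in Phase 1.
        return [1, 2, 3]

    @task
    def load(rows: list[int]) -> None:
        print(f"loaded {len(rows)} rows")

    load(extract())


example_daily_report()

Once the file is saved, the Phase 4 loop below will surface any import or parse problems.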

Phase 4: Validate

Use af CLI as a feedback loop to validate your DAG.

Step 1: Check Import Errors

After saving, check for parse errors (Airflow will have already parsed the file):

af dags errors
  • If your file appears -> fix and retry
  • If no errors -> continue

Common causes: missing imports, syntax errors, missing packages.

Step 2: Verify DAG Exists

af dags get <dag_id>

Check: DAG exists, schedule correct, tags set, paused status.

Step 3: Check Warnings

af dags warnings

Look for deprecation warnings or configuration issues.

Step 4: Explore DAG Structure

af dags explore <dag_id>

Returns in one call: metadata, tasks, dependencies, source code.

On Astro

If you're running on Astro, you can also validate locally before deploying:

  • Parse check: Run astro dev parse to catch import errors and DAG-level issues without starting a full Airflow environment
  • DAG-only deploy: Once validated, use astro deploy --dags for fast DAG-only deploys that skip the Docker image build — ideal for iterating on DAG code

Phase 5: Test

See the testing-dags skill for comprehensive testing guidance.

Once validation passes, test the DAG using the workflow in the testing-dags skill:

  1. Get user consent -- Always ask before triggering
  2. Trigger and wait -- af runs trigger-wait <dag_id> --timeout 300
  3. Analyze results -- Check success/failure status
  4. Debug if needed -- af runs diagnose <dag_id> <run_id> and af tasks logs <dag_id> <run_id> <task_id>

Quick Test (Minimal)

# Ask user first, then:
af runs trigger-wait <dag_id> --timeout 300

For the full test -> debug -> fix -> retest loop, see testing-dags.


Phase 6: Iterate

If issues found:

  1. Fix the code
  2. Check for import errors: af dags errors
  3. Re-validate (Phase 4)
  4. Re-test using the testing-dags skill workflow (Phase 5)

CLI Quick Reference

Phase      Command                      Purpose
Discover   af config connections        Available connections
Discover   af config variables          Configuration values
Discover   af config providers          Installed operators
Discover   af config version            Version info
Validate   af dags errors               Parse errors (check first!)
Validate   af dags get <dag_id>         Verify DAG config
Validate   af dags warnings             Configuration warnings
Validate   af dags explore <dag_id>     Full DAG inspection

Testing commands -- See the testing-dags skill for af runs trigger-wait, af runs diagnose, af tasks logs, etc.


Best Practices & Anti-Patterns

For code patterns and anti-patterns, see reference/best-practices.md.

Read this reference when writing new DAGs or reviewing existing ones. It covers what patterns are correct (including Airflow 3-specific behavior) and what to avoid.
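
As one illustration (a sketch of a commonly cited pattern, not a substitute for the reference file): keep expensive work out of top-level DAG code, since the scheduler re-parses DAG files frequently.

# Illustrative pattern -- names and values here are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


def expensive_lookup() -> list[int]:
    # Stand-in for a slow API or database call.
    return [1, 2, 3]


# Anti-pattern: calling expensive_lookup() here at module level would run it on
# every DAG file parse, not just when a task executes.

@dag(schedule=None, start_date=datetime(2025, 1, 1), catchup=False, tags=["example"])
def parse_time_friendly():
    @task
    def fetch() -> list[int]:
        # Pattern: the expensive call happens only at task runtime.
        return expensive_lookup()

    fetch()


parse_time_friendly()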


Related Skills

  • testing-dags: For testing DAGs, debugging failures, and the test -> fix -> retest loop
  • debugging-dags: For troubleshooting failed DAGs
  • deploying-airflow: For deploying DAGs to production (Astro or open-source)
  • migrating-airflow-2-to-3: For migrating DAGs to Airflow 3
