managing-astro-local-env by astronomer

Manage a local Airflow development environment with Astro CLI commands:

  • Start, stop, restart, and kill local Airflow containers; default credentials are admin/admin, with the webserver at http://localhost:8080
  • View logs for all components or for specific services (scheduler, webserver), with a real-time follow option
  • Access container shells and run Airflow CLI commands directly via astro dev bash and astro dev run
  • Troubleshoot common issues including port conflicts, startup failures, package errors, and...

npx skills add https://github.com/astronomer/agents --skill managing-astro-local-env

Astro Local Environment

This skill helps you manage your local Airflow environment using the Astro CLI.

Two modes: Docker (default, uses containers) and Standalone (Docker-free, uses a local venv — requires Airflow 3 + uv).

To set up a new project, see the setting-up-astro-project skill. When Airflow is running, use MCP tools from authoring-dags and testing-dags skills.


Start / Stop / Restart (Docker)

# Start local Airflow (webserver at http://localhost:8080)
astro dev start

# Stop containers (preserves data)
astro dev stop

# Kill and remove volumes (clean slate)
astro dev kill

# Restart all containers
astro dev restart

# Restart specific component
astro dev restart --scheduler
astro dev restart --webserver

Default credentials: admin / admin

Restart after modifying: requirements.txt, packages.txt, Dockerfile
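A minimal sketch of the edit-then-rebuild loop (the package name and the /tmp/astro-demo path are hypothetical stand-ins for a real project; the astro call is left as a comment because it needs a running Docker daemon):

```shell
# Hypothetical example: add a provider package, then rebuild so it gets installed.
# (/tmp/astro-demo stands in for the project root)
mkdir -p /tmp/astro-demo
echo "apache-airflow-providers-snowflake" >> /tmp/astro-demo/requirements.txt
tail -n 1 /tmp/astro-demo/requirements.txt
# astro dev restart   # a plain reload does not reinstall dependencies
```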

Standalone mode? See the next section.


Standalone Mode

Docker-free local development. Runs Airflow directly on your machine in a .venv/ managed by uv.

Requirements: Airflow 3 (runtime 3.x), uv on PATH. Not supported on Windows.

Plain astro dev init already pins a runtime 3.x image, so no version flag is needed. See setting-up-astro-project for project initialization.
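A quick preflight sketch for the uv requirement (just a PATH check; the version string is whatever your uv prints):

```shell
# Check that uv is available before attempting standalone mode.
if command -v uv >/dev/null 2>&1; then
  echo "uv found: $(uv --version)"
else
  echo "uv missing: install uv before running astro dev start --standalone"
fi
```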

Start

# One-time: set standalone as default mode
astro config set dev.mode standalone

# Or use the flag per invocation
astro dev start --standalone

Flags:
  • --foreground / -f: stream output in the foreground
  • --port / -p: override the webserver port (default: 8080)
  • --no-proxy: disable the reverse proxy

Stop / Kill / Restart

# Stop (preserves .venv)
astro dev stop

# Kill (removes .venv and .astro/standalone/ — clean slate)
astro dev kill

# Restart (preserves .venv for fast restart, use -k to kill first)
astro dev restart

If you used --standalone on start instead of setting the config, pass --standalone on every subsequent command too (stop, kill, restart, bash, run, logs, etc.).

State locations: venv in .venv/, database and logs in .astro/standalone/, DAGs from dags/.
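To see what standalone state exists on disk at any point, a small sketch (run from the project root; reports each of the locations above as present or absent):

```shell
# Report the standalone-mode state locations described above.
for p in .venv .astro/standalone dags; do
  if [ -e "$p" ]; then echo "$p: present"; else echo "$p: absent"; fi
done
```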


Reverse Proxy

Run multiple Airflow projects locally without port conflicts. Works in both Docker and standalone modes.

Each project gets a hostname like <project-name>.localhost:6563. Visit http://localhost:6563 to see all active projects.

# Check proxy status and active routes
astro dev proxy status

# Force-stop proxy (auto-restarts on next astro dev start)
astro dev proxy stop

Proxy configuration:
  • Change the proxy port: astro config set proxy.port <port>
  • Disable the proxy for one start: astro dev start --no-proxy

Default proxy port: 6563
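Assuming the hostname is derived from the project directory name (an assumption on my part; check astro dev proxy status for the actual routes), the per-project URL looks like:

```shell
# Build the proxy URL for a project (the directory path is hypothetical).
project=$(basename "/home/me/airflow/my-etl-project")
echo "http://${project}.localhost:6563"
```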


Check Status

astro dev ps

View Logs

# All logs
astro dev logs

# Specific component
astro dev logs --scheduler
astro dev logs --webserver

# Follow in real-time
astro dev logs -f

Standalone: astro dev logs works the same but shows a unified log (no per-component filtering).


Run Airflow CLI Commands

# Open a shell with Airflow environment
astro dev bash

# Run Airflow CLI commands
astro dev run airflow info
astro dev run airflow dags list

Standalone: Same commands work — bash opens a venv-activated shell, run executes in the venv.


Querying the Airflow API

Use astro api airflow to query a running local Airflow instance. Prefer operation IDs over URL paths.

Defaults: localhost:8080, admin/admin (auto-detected). Override with --api-url, --username, --password.

Discovery

# List all endpoints
astro api airflow ls

# Filter by keyword
astro api airflow ls dags
astro api airflow ls task

# Show params and schema for an operation
astro api airflow describe get_dag

Key Flags

  • -p key=value: path parameters
  • -F key=value: body/query fields (auto-converts booleans and numbers)
  • -q / --jq: jq filter applied to the response
  • --paginate: fetch all pages
  • -X / --method: override the HTTP method
  • --generate: output a curl command instead of executing

DAGs

# List all DAGs
astro api airflow get_dags

# Filter by pattern (SQL LIKE — use % wildcards)
astro api airflow get_dags -F dag_id_pattern=%etl%

# Get a specific DAG
astro api airflow get_dag -p dag_id=my_dag

# Get full details (schedule, params, etc.)
astro api airflow get_dag_details -p dag_id=my_dag

# Pause / unpause
astro api airflow patch_dag -p dag_id=my_dag -F is_paused=true
astro api airflow patch_dag -p dag_id=my_dag -F is_paused=false

# View DAG source code
astro api airflow get_dag_source -p dag_id=my_dag

# Check import errors
astro api airflow get_import_errors

DAG Runs

# List runs for a DAG
astro api airflow get_dag_runs -p dag_id=my_dag

# Trigger a run
astro api airflow trigger_dag_run -p dag_id=my_dag

# Trigger with config
astro api airflow trigger_dag_run -p dag_id=my_dag -F conf[key]=value

# Get a specific run
astro api airflow get_dag_run -p dag_id=my_dag -p dag_run_id=manual__2026-04-07

# Clear (re-run) a DAG run
astro api airflow clear_dag_run -p dag_id=my_dag -p dag_run_id=manual__2026-04-07 -F dry_run=false
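The conf[key]=value bracket syntax builds a nested JSON request body; a sketch of the payload the trigger endpoint receives for the command above (the conf field name follows the Airflow REST API; the exact request astro sends can be inspected with --generate):

```shell
# The -F conf[key]=value flag maps to this request body.
cat <<'EOF'
{"conf": {"key": "value"}}
EOF
```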

Task Instances

# List task instances for a run
astro api airflow get_task_instances -p dag_id=my_dag -p dag_run_id=manual__2026-04-07

# Use ~ as wildcard (all DAGs or all runs)
astro api airflow get_task_instances -p dag_id=my_dag -p dag_run_id=~

# Get a specific task instance
astro api airflow get_task_instance -p dag_id=my_dag -p dag_run_id=manual__2026-04-07 -p task_id=extract

# Clear/retry failed tasks
astro api airflow post_clear_task_instances -p dag_id=my_dag \
  -F dag_run_id=manual__2026-04-07 -F only_failed=true -F dry_run=false

# Get task logs
astro api airflow get_log -p dag_id=my_dag -p dag_run_id=manual__2026-04-07 \
  -p task_id=extract -p try_number=1

Config & Connections

astro api airflow get_connections
astro api airflow get_variables
astro api airflow get_config

Filtering with jq

# List only DAG IDs
astro api airflow get_dags -q '.dags[].dag_id'

# Get failed task IDs from a run
astro api airflow get_task_instances -p dag_id=my_dag -p dag_run_id=~ \
  -q '[.task_instances[] | select(.state=="failed") | .task_id]'
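The same filters can be dry-run against a canned response without a live Airflow; a sketch (assumes jq is installed locally, and the payload shape mirrors the get_dags response):

```shell
# Replay a captured get_dags response through the filter from above.
echo '{"dags":[{"dag_id":"etl_daily"},{"dag_id":"reporting"}]}' \
  | jq -r '.dags[].dag_id'
# prints etl_daily and reporting, one per line
```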

Troubleshooting

  • Port 8080 in use: stop other containers or edit .astro/config.yaml
  • Container won't start: astro dev kill, then astro dev start
  • Package install failed: check requirements.txt syntax
  • DAG not appearing: run astro dev parse to check for import errors
  • Out of disk space: docker system prune
  • Standalone won't start: ensure uv is on PATH and the runtime is 3.x
  • Proxy port conflict: astro config set proxy.port <port>
  • .venv corrupted: astro dev kill, then astro dev start --standalone
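For the port-conflict case, a quick probe of the webserver port from the host (a sketch using Python's socket module; the result depends on what is running locally):

```shell
# Probe whether anything is already listening on the webserver port.
port=8080
python3 - "$port" <<'EOF'
import socket, sys

port = int(sys.argv[1])
s = socket.socket()
# connect_ex returns 0 when something accepts the connection
busy = s.connect_ex(("127.0.0.1", port)) == 0
s.close()
print(f"port {port}: {'in use' if busy else 'free'}")
EOF
```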

Reset Environment

When things are broken:

astro dev kill
astro dev start

Upgrade Airflow

Test compatibility first

astro dev upgrade-test

Change version

  1. Edit Dockerfile:

    FROM quay.io/astronomer/astro-runtime:13.0.0
    
  2. Restart:

    astro dev kill && astro dev start
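The version bump in step 1 can also be scripted; a sketch against a throwaway copy of the Dockerfile (both version numbers are illustrative):

```shell
# Rewrite the pinned runtime tag in a copy of the Dockerfile.
printf 'FROM quay.io/astronomer/astro-runtime:12.0.0\n' > /tmp/Dockerfile.demo
sed -i.bak 's|astro-runtime:12.0.0|astro-runtime:13.0.0|' /tmp/Dockerfile.demo
cat /tmp/Dockerfile.demo
```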
    

Related Skills

  • setting-up-astro-project: Initialize projects and configure dependencies
  • authoring-dags: Write DAGs (uses MCP tools, requires running Airflow)
  • testing-dags: Test DAGs (uses MCP tools, requires running Airflow)
  • deploying-airflow: Deploy DAGs to production (Astro, Docker Compose, Kubernetes)
