airflow tasks run <dag_id> <task_id> <execution_date_or_run_id>

Run a single task instance


dag_id
    The id of the dag
task_id
    The id of the task
execution_date_or_run_id
    The execution_date of the DAG or run_id of the DAGRun
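
For example, using a hypothetical DAG id and task id (substitute your own), a single task instance can be run either by logical date or by run_id:

```shell
# Run one task instance of a DAG "example_dag", task "print_date",
# for the 2024-01-01 logical date (names are hypothetical):
airflow tasks run example_dag print_date 2024-01-01

# Alternatively, identify the DAGRun by its run_id instead of a date:
airflow tasks run example_dag print_date manual__2024-01-01T00:00:00+00:00
```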


-h, --help
    Show this help message and exit
--cfg-path <cfg_path>
    Path to config file to use instead of airflow.cfg
-f, --force
    Ignore previous task instance state, rerun regardless if task already succeeded/failed
-A, --ignore-all-dependencies
    Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps
-i, --ignore-dependencies
    Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies
-I, --ignore-depends-on-past
    Ignore depends_on_past dependencies (but respect upstream dependencies)
-N, --interactive
    Do not capture standard output and error streams (useful for interactive debugging)
-j, --job-id <job_id>
-l, --local
    Run the task using the LocalExecutor
--map-index <map_index>
    Mapped task index
-m, --mark-success
    Mark jobs as succeeded without running them
-p, --pickle <pickle>
    Serialized pickle object of the entire dag (used internally)
--pool <pool>
    Resource pool to use
-r, --raw
--ship-dag
    Pickles (serializes) the DAG and ships it to the worker
-S, --subdir <subdir>
    File location or directory from which to look for the DAG. Defaults to '[AIRFLOW_HOME]/dags', where [AIRFLOW_HOME] is the value of the 'AIRFLOW_HOME' setting in 'airflow.cfg'
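
A few of these flags are commonly combined when rerunning a task by hand; a sketch, again with hypothetical DAG and task ids:

```shell
# Force a local rerun, ignoring the previous task instance state and
# any depends_on_past constraint (dag/task names are placeholders):
airflow tasks run --force --local --ignore-depends-on-past \
    example_dag print_date 2024-01-01

# For a mapped task, --map-index selects one expanded task instance
# (here index 3 of a hypothetical task "mapped_task"):
airflow tasks run --force --map-index 3 example_dag mapped_task 2024-01-01
```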