The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs and Tasks, which help improve the observability of an Airflow cluster.
The exporter is based on an existing Prometheus exporter for Airflow.
The plugin has been tested with:
- Airflow >= 1.10.4
- Python 3.6+
The scheduler metrics assume that there is a DAG named canary_dag. In our setup, the canary_dag is a DAG with tasks that perform very simple actions, such as establishing a database connection. This DAG is used to test the uptime of the Airflow scheduler itself.
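For illustration, a minimal sketch of what such a canary DAG might look like; the schedule, start date, and task below are assumptions, not something the exporter requires beyond the DAG name:

from datetime import datetime, timedelta

from airflow import DAG, settings
from airflow.operators.python_operator import PythonOperator

# Hypothetical canary DAG: only the name "canary_dag" matters to the exporter.
dag = DAG(
    "canary_dag",
    start_date=datetime(2019, 1, 1),
    schedule_interval=timedelta(minutes=5),
    catchup=False,
)

def check_metadata_db():
    # Open a session against the Airflow metadata database as a very
    # simple "is everything alive" action.
    session = settings.Session()
    session.execute("SELECT 1")
    session.close()

check_db = PythonOperator(
    task_id="check_metadata_db",
    python_callable=check_metadata_db,
    dag=dag,
)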
The exporter can be installed as an Airflow Plugin using:
pip install airflow-prometheus-exporter
This should ideally be installed in your Airflow virtualenv.
Metrics will be available at
http://<your_airflow_host_and_port>/admin/metrics/
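As a quick sanity check that metrics are being exported, you can fetch the endpoint from a script; a small sketch using the requests library (the host and port below are assumptions, substitute your own):

import requests

# Fetch the exporter endpoint; replace host/port with your Airflow webserver.
resp = requests.get("http://localhost:8080/admin/metrics/")
resp.raise_for_status()
print(resp.text[:500])  # print the first few exported metric lines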
Number of tasks with a specific status. All the possible states are listed in the Airflow documentation.
Duration of successful tasks in seconds.
Number of times a particular task has failed.
Value of a configurable parameter in the XCom table. The XCom value is deserialized as a dictionary and, if the configured key is found for a particular task_id, its value is reported as a gauge.
Add task / key combinations in config.yaml:
xcom_params:
  - task_id: abc
    key: count
  - task_id: def
    key: errors
A task_id of 'all' will match against all Airflow tasks:
xcom_params:
  - task_id: all
    key: count
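On the task side, the pushed XCom value needs to deserialize to a dictionary containing the configured key. A possible sketch of such a task, assuming the exporter picks up the XCom value the task pushes (here via its return value); only the task_id "abc" and the "count" key are tied to the config.yaml entry above, everything else is an assumption:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Hypothetical DAG that produces the XCom value read by the exporter.
dag = DAG("xcom_example_dag", start_date=datetime(2019, 1, 1), schedule_interval=None)

def compute_counts():
    row_count = 42  # placeholder for a real computation
    # The exporter deserializes the stored XCom value as a dictionary and
    # reports the configured key ("count" here) as a gauge.
    return {"count": row_count}

abc = PythonOperator(
    task_id="abc",  # matches the task_id configured above
    python_callable=compute_counts,
    dag=dag,
)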
Number of DAGs with a specific status. All the possible states are listed in the Airflow documentation.
Duration of a successful DagRun in seconds.
Scheduling delay for a DAG Run in seconds. This metric assumes there is a canary_dag. The scheduling delay is measured as the delay between when a DAG is marked as SCHEDULED and when it actually starts RUNNING.
Scheduling delay for a Task in seconds. This metric assumes there is a canary_dag.
Number of tasks in the QUEUED state at any given instant.
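To spot-check these scheduler gauges from a script, one option is to parse the exporter output with the prometheus_client text parser; a small sketch (the host, port, and the name filter are assumptions, since the exact metric names can be read off the endpoint itself):

import requests
from prometheus_client.parser import text_string_to_metric_families

# Fetch and parse the exporter output, then print scheduler-related gauges.
text = requests.get("http://localhost:8080/admin/metrics/").text
for family in text_string_to_metric_families(text):
    if "scheduler" in family.name or "queued" in family.name:
        for sample in family.samples:
            print(sample.name, sample.labels, sample.value)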