Airflow is a platform to programmatically author, schedule, and monitor workflows. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies.
Airflow pipelines are defined in Python, which allows writing code that instantiates pipelines dynamically.
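Because a DAG file is ordinary Python, tasks can be generated in a loop. The following is a minimal sketch assuming Airflow 2.4+ parameter names; the `dag_id`, table names, and callable are invented for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

with DAG(
    dag_id="dynamic_pipeline_example",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # The DAG file is plain Python, so tasks can be created in a loop.
    for table in ["users", "orders", "events"]:  # hypothetical tables
        PythonOperator(
            task_id=f"process_{table}",
            # Bind the loop variable via a default argument.
            python_callable=lambda table=table: print(f"processing {table}"),
        )
```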
Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers. Airflow is ready to scale to infinity.
Easily define your own operators and extend the library to fit the level of abstraction that suits your environment.
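A custom operator is a subclass of `BaseOperator` that implements `execute`. The sketch below uses a hypothetical `HelloOperator` with an invented `name` parameter:

```python
from airflow.models.baseoperator import BaseOperator


class HelloOperator(BaseOperator):
    """A hypothetical operator that logs a greeting."""

    def __init__(self, name: str, **kwargs) -> None:
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # Called by the worker when the task runs; the return value
        # is pushed to XCom by default.
        message = f"Hello, {self.name}!"
        self.log.info(message)
        return message
```

Inside a DAG it is used like any built-in operator, e.g. `HelloOperator(task_id="greet", name="Airflow")`.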
Airflow pipelines are lean and explicit. Parametrization is built into its core using the powerful Jinja templating engine.
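For instance, template fields are rendered by the Jinja engine just before a task runs. A small sketch (the `dag_id` and task are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="templating_example",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # {{ ds }} is a built-in macro rendered to the run's logical
    # date (YYYY-MM-DD) before the command executes.
    BashOperator(
        task_id="print_logical_date",
        bash_command="echo 'Running for {{ ds }}'",
    )
```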
Monitor, schedule and manage your workflows via a robust and modern web application.
Airflow is not a data streaming solution. Tasks do not move data from one to the other (though tasks can exchange metadata!).
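This metadata exchange happens through XCom: return values are stored as small records in the metadata database, not streamed between tasks. A minimal sketch using the TaskFlow API (Airflow 2.x assumed; the DAG and task names are invented):

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def xcom_example():  # illustrative DAG
    @task
    def produce() -> dict:
        # The return value is pushed to XCom as metadata.
        return {"row_count": 42}

    @task
    def consume(stats: dict) -> None:
        print(f"upstream reported {stats['row_count']} rows")

    consume(produce())


xcom_example()
```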
Airflow is not in the Spark Streaming or Storm space; it is more comparable to Oozie or Azkaban. Workflows are expected to be mostly static or slowly changing. You can think of the structure of the tasks in your workflow as slightly more dynamic than a database structure would be.
Airflow workflows are expected to look similar from one run to the next; this allows for clarity around the unit of work and continuity.