Summary
Apache Airflow is an open-source platform used to programmatically author, schedule, and monitor workflows.[1]
It has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers; because pipelines are defined in Python code, they can be generated dynamically.[1]
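Because pipelines are ordinary Python code, the set of tasks can be built up programmatically at parse time. A minimal sketch of that idea, assuming Airflow 2.x with the classic PythonOperator; the DAG id, source names, and processing function are illustrative:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def process(source):
    # Placeholder work for one data source.
    print(f"processing {source}")


with DAG(
    dag_id="dynamic_example",          # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,            # run only when triggered manually
    catchup=False,
) as dag:
    # Tasks are generated in an ordinary Python loop, so the shape of the
    # pipeline comes from code rather than from a static config file.
    for source in ["orders", "customers", "payments"]:
        PythonOperator(
            task_id=f"process_{source}",
            python_callable=process,
            op_args=[source],
        )
```

With a distributed executor such as the CeleryExecutor, the scheduler hands each generated task to workers through the message broker, which is how the worker pool scales out.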
It provides plug-and-play operators that are ready to execute tasks on various third-party services, making it easy to apply to current infrastructure and extend to next-gen technologies.[1]
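A hedged sketch of how operators compose into a pipeline, assuming Airflow 2.x with the apache-airflow-providers-postgres package installed; the DAG id, SQL statement, and the "warehouse_db" connection are illustrative:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="operator_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # A built-in operator that simply runs a shell command.
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pulling the daily export'",
    )

    # A provider ("plug-and-play") operator that runs SQL against a
    # Postgres connection configured in Airflow's connection store.
    load = PostgresOperator(
        task_id="load",
        postgres_conn_id="warehouse_db",   # illustrative connection id
        sql="INSERT INTO daily_counts SELECT CURRENT_DATE, 1;",
    )

    # Dependencies are declared with the >> operator.
    extract >> load
```

Provider packages for AWS, GCP, Databricks, and many other services follow the same pattern: install the package, configure a connection, and drop the operator into a DAG.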
It is one of the most robust platforms used by data engineers for orchestrating workflows or pipelines.[2]
It allows users to easily visualize their data pipelines' dependencies, progress, logs, code, and success status, and to trigger tasks, all from the built-in user interface.[2]
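DAG runs can also be started outside the UI. A minimal sketch using the Airflow 2.x stable REST API, assuming a webserver at localhost:8080 with the basic-auth API backend enabled; the credentials and DAG id are illustrative:

```python
import requests

# Ask the webserver to create a new run of a DAG; the run, its logs,
# and its task statuses then appear in the Airflow UI.
response = requests.post(
    "http://localhost:8080/api/v1/dags/operator_example/dagRuns",
    auth=("admin", "admin"),   # basic-auth API backend assumed
    json={"conf": {}},         # optional run-time parameters
    timeout=30,
)
response.raise_for_status()
print(response.json()["dag_run_id"])
```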
Summaries from the best pages on the web
Apache Airflow is an open-source workflow management platform for data engineering pipelines. It started at Airbnb in October 2014[2] as a solution to manage the company's increasingly complex workflows. Creating Airflow allowed Airbnb to programmatically author and schedule their workflows and monitor them via the built-in Airflow user interface.[3][4] From the beginning, the project was made open source, becoming an Apache Incubator project in March 2016 and a top-level Apache Software Foundation project in January 2019.
Apache Airflow - Wikipedia
wikipedia.org
Apache Airflow is an open-source tool to programmatically author, schedule, and monitor workflows. It is one of the most robust platforms used by Data Engineers for orchestrating workflows or pipelines. You can easily visualize your data pipelines’ dependencies, progress, logs, code, trigger tasks, and success status.
What is Apache Airflow? | Qubole
qubole.com
Learn about Apache Airflow, the open-source platform used by data professionals around the world to author, schedule, and manage workflows.
What is Apache Airflow? - Astronomer
astronomer.io