A successful pipeline moves data efficiently, minimizing pauses and blockages between tasks, keeping every process along the way operational. Apache Airflow provides a single customizable environment for building and managing data pipelines, eliminating the need for a hodge-podge collection of tools, snowflake code, and homegrown processes. Using real-world scenarios and examples, Data Pipelines with Apache Airflow teaches you how to simplify and automate data pipelines, reduce operational overhead, and smoothly integrate all the technologies in your stack.
A great introduction to Apache Airflow. It's well-written and well thought out.
Easy to read and provides a range of tips to be productive with Airflow.
Covers everything necessary to start creating your own pipelines, with well-elaborated examples and "real-life" scenarios.
Explains everything about how to define and schedule tasks to build a production-ready DAG.
A great intro to what a DAG is and the basics behind its architecture.