Learn to author, schedule and monitor data pipelines through practical examples using Apache Airflow
What you'll learn
- Create plugins to add functionality to Apache Airflow.
- Use Docker with Airflow and its different executors.
- Master core concepts such as DAGs, Operators, Tasks and Workflows.
- Understand and apply advanced Airflow concepts such as XComs, branching and SubDAGs.
- Know the differences between the Sequential, Local and Celery Executors, how they work and when to use each.
- Use Apache Airflow in a Big Data ecosystem with Hive, PostgreSQL, Elasticsearch, etc.
- Install and configure Apache Airflow.
- Think through, design and implement Airflow solutions to real data processing problems.

Requirements
- Access to a personal computer with VirtualBox installed; a VM (about 5 GB) has to be downloaded.
- Some prior programming or scripting experience. Python experience will help you a lot, but since Python is an easy language to learn, it shouldn't be too difficult if you are not yet familiar with it.

Description
Apache Airflow is an open-source platform to programmatically author, schedule and monitor workflows. If you have many ETL jobs to manage, Airflow is a must-have. In this course you are going to learn how to master Apache Airflow through theory and practical video lessons. Starting from very basic notions, such as what Airflow is and how it works, we will dive into advanced concepts such as creating plugins and building truly dynamic pipelines.

Who this course is for:
- People curious about data engineering.
- People who want to learn basic and advanced Apache Airflow concepts.
- People who like a hands-on approach.
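To make the core idea concrete before the course dives in: a workflow in Airflow is a DAG of tasks, where each dependency ("upstream >> downstream") forces one task to finish before the next starts, and the Sequential Executor runs the resulting order one task at a time. The sketch below illustrates that idea in plain Python; it is a conceptual toy, not the Airflow API, and the `extract`/`transform`/`load`/`notify` task names are made up for the example.

```python
from collections import deque

def topological_order(tasks, deps):
    """Order tasks so every upstream task comes before its downstreams.

    tasks: list of task names
    deps:  list of (upstream, downstream) pairs, i.e. "upstream >> downstream"
    """
    downstreams = {t: [] for t in tasks}   # adjacency: task -> tasks it unblocks
    indegree = {t: 0 for t in tasks}       # how many upstreams each task waits on
    for up, down in deps:
        downstreams[up].append(down)
        indegree[down] += 1

    ready = deque(t for t in tasks if indegree[t] == 0)  # no upstreams: can run now
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)                    # "run" the task
        for d in downstreams[t]:
            indegree[d] -= 1
            if indegree[d] == 0:           # all upstreams done: d becomes ready
                ready.append(d)

    if len(order) != len(tasks):           # leftover tasks mean a cycle
        raise ValueError("cycle detected: this workflow is not a DAG")
    return order

# A tiny ETL-style pipeline: extract -> transform -> load, then notify.
tasks = ["extract", "transform", "load", "notify"]
deps = [("extract", "transform"), ("transform", "load"), ("load", "notify")]
print(topological_order(tasks, deps))
# → ['extract', 'transform', 'load', 'notify']
```

In real Airflow the ordering and execution are handled for you by the scheduler and the executor you choose (Sequential, Local or Celery); you only declare the tasks and their dependencies in a DAG file. Python's standard library also ships `graphlib.TopologicalSorter` (3.9+) if you ever need this ordering outside Airflow.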