Data Engineering: Extract Transform Load (ETL) Programming with Apache Airflow - Industry Applications
From 35 €/h
This course provides a hands-on approach to mastering Apache Airflow, a powerful open-source workflow automation tool widely used in ETL (Extract, Transform, Load) processes. You will learn how to build, schedule, monitor, and optimize data pipelines, ensuring data reliability and efficiency in production environments. Through real-world examples and projects, you will gain the skills required to become a proficient Data Engineer capable of handling complex workflows in modern data ecosystems.
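To give a concrete flavour of the workflows covered, here is a minimal sketch of a daily ETL pipeline expressed as a DAG, assuming a recent Apache Airflow 2.x release; the task names, schedule, and toy transformation are illustrative and not taken from the course material.

```python
# Minimal ETL DAG sketch (illustrative; assumes Apache Airflow 2.x).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system (API, database, files).
    return [{"id": 1, "amount": 10}, {"id": 2, "amount": 25}]


def transform(ti):
    # Read the extracted rows from XCom and apply a simple, illustrative transformation.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "amount_eur": round(row["amount"] * 0.92, 2)} for row in rows]  # rate is made up


def load(ti):
    # Placeholder: write the transformed rows to a target store (warehouse, data lake, ...).
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_etl_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency chain: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```

Dropped into Airflow's dags/ folder, a file like this is picked up by the scheduler, and every task run can then be monitored, retried, and backfilled from the web UI.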
This course will allow you to build multiple real-world ETL workflows using Apache Airflow. Let me know if you would like additional information! 🚀
Extra information
Bring your own laptop
Location
- At student's location: around Brussels, Belgium
- Online from Belgium
About Me
I am a dedicated software engineer with a Master's degree and a PhD in Computer Science from the Université Libre de Bruxelles (ULB). With years of experience in both academia and industry, I specialize in helping students understand complex concepts in programming, algorithms, and software development. My teaching approach is student-centered, fostering critical thinking and problem-solving skills tailored to individual learning styles.
Whether you are a beginner looking to write your first line of code or an advanced learner diving into data structures, machine learning, or system design, I am here to guide you every step of the way.
Education
- PhD in Computer Science from the Université Libre de Bruxelles (ULB), Brussels, Belgium
- Specialized in Big Data Management, Geographical Data Management, Databases, and DevOps automation tools
Experience / Qualifications
Databases: SQL Server, PostgreSQL, MySQL
Data Science and Data Engineering tools: Python, GenAI, Data Lake, DBT, Airflow
Big Data tools: Hadoop, Spark
Cloud: AWS and Azure services
DevOps tools: CI/CD pipelines, GitHub, Docker, Kubernetes
Age
Adults (18-64 years old)
Student level
Beginner
Intermediate
Advanced
Duration
60 minutes
The class is taught in
English
Dutch
Arabic
In this course, you will learn how to efficiently package, containerize, and deploy Python applications and microservices using Docker. The course covers fundamental Docker concepts, best practices for structuring Python projects, and strategies for building scalable and portable applications. Through hands-on projects, you will gain practical experience in creating Docker images, managing containers, and orchestrating microservices, enabling seamless deployment across different environments.
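As a small taste of what such a setup looks like, here is a minimal Dockerfile sketch for containerizing a simple Python service; the file layout (an app/ package with a requirements.txt) and the exposed port are assumptions made for illustration rather than course material.

```dockerfile
# Illustrative Dockerfile for a small Python service (paths and port are hypothetical).
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached when only application code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code last.
COPY app/ ./app

# Expose the service port and start the application.
EXPOSE 8000
CMD ["python", "-m", "app.main"]
```

From there, the usual loop is to build and run the image locally, for example docker build -t my-service . followed by docker run -p 8000:8000 my-service (the image name here is hypothetical), before pushing it to a registry and deploying.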
Contact me if you want to have more information about the course!
Whether you are a beginner writing your first lines of code or an advanced learner tackling complex concepts, I tailor lessons to suit your needs. Together, we will focus on practical skills, problem-solving, and real-world projects to make programming intuitive and rewarding. Let’s work together to turn your goals into achievements!