Building an Airflow Pipeline That Talks to AWS — Data Pipelines in the Cloud (III)
This tutorial is a complete guide to building an end-to-end data pipeline with Apache Airflow that communicates with AWS services like RDS (relational database) and S3 (object storage) to perform data transformations automatically and efficiently.
Read more →
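
Even before clicking through, it may help to see the shape of the thing: below is a rough, hedged sketch of the kind of task such a pipeline runs, reading a CSV from S3, transforming it, and loading it into PostgreSQL on RDS. Everything here (bucket name, S3 key, table name, connection ids) is an illustrative placeholder, not the tutorial's actual code, and it assumes the Amazon and Postgres Airflow provider packages plus pandas are installed.

```python
# Hedged sketch of the kind of task the tutorial builds: read a CSV from S3,
# transform it, and load it into Postgres. The bucket, key, table name, and
# connection ids below are illustrative placeholders.
import pandas as pd

from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.postgres.hooks.postgres import PostgresHook


def transform_and_load():
    # Download the input CSV from S3 using a configured AWS connection.
    s3 = S3Hook(aws_conn_id="aws_default")
    local_path = s3.download_file(
        key="input/sample.csv", bucket_name="my-pipeline-bucket"
    )

    # A toy transformation: drop duplicate rows.
    df = pd.read_csv(local_path).drop_duplicates()

    # Write the result into a table on the RDS PostgreSQL instance.
    pg = PostgresHook(postgres_conn_id="postgres_default")
    df.to_sql("clean_data", pg.get_sqlalchemy_engine(), if_exists="replace", index=False)
```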

Using Amazon Web Services (AWS) with the Command Line — Data Pipelines in the Cloud (II)
Welcome back to the ‘Data Pipelines in the Cloud’ series! In the first part, I introduced Airflow as a tool for orchestrating data pipelines and demonstrated how to code and execute a minimal Airflow pipeline (DAG) in your local environment. In this second part, we’ll lay the groundwork for a more functional Airflow DAG by using the AWS Command Line Interface to set up a relational database in the cloud (PostgreSQL on RDS), along with a bucket for object storage (S3). We’ll then upload a sample CSV file to the bucket, which we’ll later use as input for an Airflow DAG that performs a meaningful transformation on this data.
Read more →
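
As a preview of those setup steps, here is a hedged Python sketch of the same operations using boto3, the AWS SDK for Python (the post itself works through the AWS CLI instead); the region, bucket name, and file paths are made-up placeholders.

```python
# Hedged sketch of the steps described above, using boto3 rather than the
# AWS CLI that the post uses. Region, bucket name, and paths are made up.
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")  # illustrative region

# Create the S3 bucket that will hold the pipeline's input data.
# Bucket names must be globally unique.
s3.create_bucket(
    Bucket="my-pipeline-bucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)

# Upload the sample CSV that a later DAG will transform.
s3.upload_file("sample.csv", "my-pipeline-bucket", "input/sample.csv")
```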

A Beginner’s Introduction to Airflow with Docker — Data Pipelines in the Cloud (I)
My attempt at using Stable Diffusion to depict something cloud-computery.
Learn the essentials of Apache Airflow for creating scalable and automated data pipelines in the cloud with this comprehensive, step-by-step beginner’s guide. Discover what problem Airflow solves and under what circumstances it’s better to use it, then run your first Airflow DAG on Docker with the Windows Subsystem for Linux.
Read more →
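
For a flavor of the minimal DAG that guide builds up to, here is a hedged sketch; the DAG and task ids are illustrative rather than the post's actual names, and the `schedule` argument assumes Airflow 2.4 or newer (older releases call it `schedule_interval`).

```python
# A minimal "hello world" style DAG, similar in spirit to what the post
# walks through. DAG id and task id are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hello_airflow",          # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule=None,                   # run only when triggered manually
    catchup=False,                   # don't backfill past intervals
) as dag:
    # A single task that just echoes a message in a Bash shell.
    say_hello = BashOperator(
        task_id="say_hello",
        bash_command="echo 'Hello from Airflow!'",
    )
```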