An end-to-end GoodReads Data Pipeline for Building Data Lake, Data Warehouse and Analytics Platform.
This is an ELT data pipeline that tracks the activity of an e-commerce website across orders, reviews, deliveries, and shipment dates. The project uses Airflow, AWS RDS (Postgres), and Python, among other technologies.
Build a data warehouse from scratch, including full loads, daily incremental loads, schema design, and SCD Types 1 and 2.
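The SCD Type 2 handling mentioned above can be sketched in plain Python: when a tracked attribute changes, the current dimension row is expired and a new version is inserted. This is a minimal illustration of the pattern, not any project's actual implementation; the column names (`valid_from`, `valid_to`, `is_current`) are illustrative conventions.

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key="customer_id", tracked=("city",),
                today=None):
    """Apply SCD Type 2 changes: expire the current row and insert a new
    version when a tracked attribute changes. Column names are illustrative."""
    today = today or date.today().isoformat()
    for new in incoming:
        current = next((r for r in dim_rows
                        if r[key] == new[key] and r["is_current"]), None)
        if current is None:
            # brand-new key: insert as the current version
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
        elif any(current[c] != new[c] for c in tracked):
            # tracked attribute changed: close the old version, open a new one
            current["valid_to"] = today
            current["is_current"] = False
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
    return dim_rows
```

In a warehouse this same logic would typically be an SQL `MERGE` or update-then-insert; SCD Type 1 is the simpler case where the row is overwritten in place with no history kept.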
Datitos - TP2 on steroids
Data engineering projects on data modelling, data warehousing, data lake development, orchestration, and analysis.
A near-real-time data processing pipeline for Pinterest posts.
A small project to play around with Apache Airflow and ETL.
An Apache Airflow demo project that sets up three DAGs to show how to pass parameters from a DAG to a triggered DAG.
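In Airflow, the parameter-passing pattern described above uses `TriggerDagRunOperator` with a `conf` dict in the parent DAG, which the triggered DAG's tasks read back via `context["dag_run"].conf`. A minimal dependency-free sketch of that data flow (function and key names are illustrative, and Airflow itself is stubbed out so the sketch runs anywhere):

```python
def trigger_dag_run(task_callable, conf):
    """Stand-in for TriggerDagRunOperator: hand a conf dict to a task in
    the triggered DAG, the way Airflow exposes it through the task
    context as context["dag_run"].conf."""
    context = {"dag_run": {"conf": conf}}
    return task_callable(**context)

def process_order(**context):
    # In a real PythonOperator callable this would be
    # context["dag_run"].conf["order_id"]
    conf = context["dag_run"]["conf"]
    return f"processing order {conf['order_id']}"
```

With Airflow installed, the parent DAG would declare `TriggerDagRunOperator(task_id="trigger", trigger_dag_id="child_dag", conf={"order_id": 42})` instead of calling the stub directly.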
Orchestrate a data pipeline using Airflow.
A dashboard with sentiment scores for tweets.
An Airflow DAG transformation framework.
This project demonstrates how to build and automate an ETL pipeline written in Python and schedule it with the open-source Apache Airflow orchestrator on an AWS EC2 instance.
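The usual shape of such a pipeline is a set of plain extract/transform/load functions that Airflow then wires together as separate `PythonOperator` tasks. This is a hedged sketch of that structure, not the project's actual code; the CSV source and field names are illustrative (a real extract step would pull from S3 or an API, and the load step would write to a warehouse table):

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast amounts to float and drop rows missing an id."""
    return [{**r, "amount": float(r["amount"])} for r in rows if r.get("id")]

def load(rows: list[dict]) -> str:
    """Load: serialize to JSON lines (a stand-in for a warehouse insert)."""
    return "\n".join(json.dumps(r) for r in rows)
```

Keeping each step a pure function makes the tasks easy to unit-test outside Airflow and easy to schedule inside it.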
DAG factory
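A DAG factory typically loops over a config and registers one DAG per entry in the module namespace, where Airflow's DagBag discovery picks them up. A minimal sketch of the pattern with Airflow stubbed out (a real factory would instantiate `airflow.DAG` objects; the dataset names and schedules here are illustrative, and real factories often read them from YAML):

```python
DATASETS = {                        # illustrative config
    "orders":    {"schedule": "@daily"},
    "reviews":   {"schedule": "@hourly"},
    "shipments": {"schedule": "@daily"},
}

def build_dag_spec(name: str, cfg: dict) -> dict:
    """Build one DAG definition; with Airflow installed this would
    return an airflow.DAG object instead of a plain dict."""
    return {"dag_id": f"etl_{name}",
            "schedule": cfg["schedule"],
            "tasks": ["extract", "transform", "load"]}

# Register every generated DAG at module level, as Airflow's DAG
# discovery expects for factory-produced DAGs.
for _name, _cfg in DATASETS.items():
    globals()[f"etl_{_name}"] = build_dag_spec(_name, _cfg)
```

The payoff is that adding a new pipeline becomes a one-line config change rather than a new DAG file.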
Analysing live tweets from Twitter by building a big data pipeline and scheduling it with Airflow, using Kafka for tweet ingestion, Cassandra for storing parsed tweets, and Spark for analysis.
This repository is no longer maintained.