ETL pipeline to parse character images from pictures and create a dataset of English characters
Updated Mar 31, 2018 - Python
A web app where an emergency worker can input a new message and get classification results in several categories. The web app will also display visualizations of the data.
ETL process which loads and transforms Medicare hospital data using Python and Hive
Workflow code for the CLIN project
Data engineering techniques to analyze Figure Eight data and build a classification model for an API that classifies disaster messages into one of 36 given categories, so the relief organization responsible for that category can be contacted to help affected people.
Data modeling in a PostgreSQL database
Build disaster response pipelines with the message data from Figure Eight. An ETL pipeline was built to process message data from major natural disasters around the world, an ML pipeline was built to categorize these events according to the messages, and a web app was set up to visualize the data and classify newly entered messages.
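For illustration only, the "categorize messages" step in projects like the one above can be sketched with a toy keyword matcher. The real projects train ML pipelines on labeled Figure Eight data; the category names and keywords below are assumptions for demonstration, not the actual model.

```python
# Toy keyword-based categorizer for disaster messages (illustrative only;
# category names and keyword sets are hypothetical, not from any real model).
CATEGORY_KEYWORDS = {
    "water": {"water", "thirsty", "drink"},
    "medical_help": {"medicine", "doctor", "injured", "hospital"},
    "shelter": {"shelter", "tent", "homeless"},
}

def categorize(message):
    """Return every category whose keywords appear in the message."""
    tokens = set(message.lower().split())
    return sorted(
        cat for cat, words in CATEGORY_KEYWORDS.items() if tokens & words
    )

print(categorize("We are injured and need water"))
# → ['medical_help', 'water']
```

A trained classifier replaces the keyword lookup, but the interface (message in, list of categories out) is the same one the web apps above expose.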
As a recruiter, screening candidates can turn into a repetitive, time-consuming task. This GUI application was created as a viable solution! Today on Desktop, Tomorrow on the Cloud.
GCP App Engine ETL from publicly available API. (Home project to play with GCP products.)
This project focuses on the current job market for data-related jobs in four countries: the United States (US), Canada, Australia, and Singapore. Using the Extract, Transform, Load (ETL) methodology, the project will extract all data-related job postings from Indeed in these four countries, transform them, and load them into…
A repository concentrating on using high-end parallel pipelines to perform ETL across various data sources
This application aims to provide US COVID-19 hospitalization-related insights by state, served through a front-end API based on user searches. It uses big data stores such as MongoDB, big data technologies such as Apache Spark, and a front-end Flask application. The reason behind using these technologies is to increase computation…
Solving a problem for a company that resells cars.
An ETL pipeline to efficiently extract, transform, and load data into a PostgreSQL table
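A minimal sketch of such an extract/transform/load flow, using the standard library's sqlite3 as a stand-in for PostgreSQL (with psycopg2 the connection setup differs and the INSERT placeholders are %s instead of ?). The CSV fields, table name, and transform logic are assumptions for demonstration.

```python
# Minimal ETL sketch: extract rows from CSV text, transform them,
# and load them into a SQL table. sqlite3 stands in for PostgreSQL.
import csv
import io
import sqlite3

RAW = """name,signup_date
alice,2020-01-05
bob,2020-02-10
"""

def extract(text):
    """Parse CSV text into a list of dicts (one per row)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Title-case names and derive a year column from the date."""
    return [
        (r["name"].title(), r["signup_date"], int(r["signup_date"][:4]))
        for r in rows
    ]

def load(records, conn):
    """Create the target table and bulk-insert the transformed records."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (name TEXT, signup_date TEXT, year INT)"
    )
    conn.executemany("INSERT INTO users VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT name, year FROM users").fetchall())
# → [('Alice', 2020), ('Bob', 2020)]
```

Keeping extract, transform, and load as separate functions is the common pattern across the repos listed here: each stage can be tested in isolation and swapped (e.g. a database extract, a Spark transform) without touching the others.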
Analyzing messaging data for a disaster response. The model generated in this project is deployed as a web app and can be used to categorize messages.
An ETL Pipeline using Apache Spark in Python
Classifying real messages that were sent during disaster events so that they can be sent to an appropriate disaster relief agency.
Amazon Reviews Metrics