In this project, I will be using telemetry data and exploring how to perform batch processing with Python and QuestDB, a high-performance time-series database. I will divide the dataset into smaller batches and process each one separately, so that large amounts of data can be handled efficiently. By the end of this project, I will have a better understanding of how to use batch processing to manage and analyze large datasets with Python and QuestDB.
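The core idea described above — splitting a dataset into fixed-size batches — can be sketched in a few lines of Python. This is a minimal, self-contained illustration; the record fields (`sensor_id`, `temperature`) are hypothetical stand-ins for real telemetry columns:

```python
from itertools import islice

def batched(records, batch_size):
    """Yield successive fixed-size batches from an iterable of records."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Example: split 10 hypothetical telemetry readings into batches of 4.
readings = [{"sensor_id": i, "temperature": 20.0 + i} for i in range(10)]
batches = list(batched(readings, 4))
print([len(b) for b in batches])  # → [4, 4, 2]
```

Each batch can then be transformed and loaded independently, which keeps memory usage bounded regardless of the total dataset size.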
To run the Docker Compose file, make sure the Docker daemon is running on your machine, navigate to the directory containing the docker-compose.yml file, and run the following command:
- `docker-compose up -d`
Running `docker-compose up -d` builds and starts both the QuestDB container and the Python script container in detached mode. You can verify that the QuestDB container launched by opening `localhost:9000` in your browser, where you will see QuestDB's web-based query editor.
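For reference, a compose file for this setup might look roughly like the sketch below. The service names, build context, and port mappings are assumptions for illustration, not the exact file used in this project:

```yaml
# Hypothetical docker-compose.yml sketch; service names and build context are assumptions.
version: "3"
services:
  questdb:
    image: questdb/questdb
    ports:
      - "9000:9000"   # web console / REST API
      - "9009:9009"   # InfluxDB Line Protocol ingestion
  python:
    build: .          # image built from a local Dockerfile containing the batch script
    depends_on:
      - questdb
```

The `depends_on` entry only controls start order; the script itself should still retry until QuestDB is ready to accept connections.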
If you want to run the data load on its own, start only the Python container. Its script performs the necessary data transformation and loads the results into QuestDB:
- `docker-compose up python`
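Inside the Python container, one common way to load rows into QuestDB is the InfluxDB Line Protocol (ILP), which QuestDB accepts on TCP port 9009. The sketch below formats rows for a table named `test` (the table queried later in this walkthrough); the column name and timestamps are illustrative assumptions, and the actual network send is commented out because it requires the QuestDB container to be running:

```python
import socket

def to_ilp(table, row, ts_ns):
    """Format one row as an InfluxDB Line Protocol line (no tags, nanosecond timestamp)."""
    fields = ",".join(f"{k}={v}" for k, v in row.items())
    return f"{table} {fields} {ts_ns}\n"

# Hypothetical batch of transformed telemetry rows.
rows = [{"temperature": 21.5}, {"temperature": 22.0}]
payload = "".join(
    to_ilp("test", r, 1_700_000_000_000_000_000 + i) for i, r in enumerate(rows)
)
print(payload)

# Sending requires the QuestDB container to be up:
# with socket.create_connection(("localhost", 9009)) as sock:
#     sock.sendall(payload.encode())
```

Sending one payload per batch, rather than one row at a time, is what makes the batch approach pay off: fewer round trips and better ingestion throughput.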
By refreshing the page at `localhost:9000` and running the query `select * from test` in the query editor, you can confirm that the data has been loaded successfully.
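The same check can be scripted against QuestDB's REST API, which exposes an `/exec` endpoint on port 9000 that runs a SQL query and returns JSON. A minimal sketch, with the actual HTTP call commented out since it needs a running instance:

```python
import json
import urllib.parse
import urllib.request

def build_exec_url(query, host="localhost", port=9000):
    """Build the URL for QuestDB's REST /exec endpoint for a given SQL query."""
    params = urllib.parse.urlencode({"query": query})
    return f"http://{host}:{port}/exec?{params}"

url = build_exec_url("select * from test limit 5")
print(url)

# Running the query requires the QuestDB container to be up:
# with urllib.request.urlopen(url) as resp:
#     result = json.load(resp)
#     print(result["columns"], result["dataset"])
```

This is handy for automated verification after each batch load, instead of checking the web console by hand.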
To stop just the QuestDB container:
- `docker stop questdb`
To bring it back up:
- `docker-compose up questdb`
When you are finished, stop and remove all of the project's containers:
- `docker-compose down`