


ML Models in Production

Python · Celery · RabbitMQ · Redis · React · FastAPI · NPM · TensorFlow

This repo walks through a full working example of serving an ML model using asynchronous Celery tasks and FastAPI. All code can be found in this repository. We won't discuss the ML model itself in detail: it was trained with TensorFlow on the COCO dataset, which covers 80 object classes such as cat, dog, and bird. For more detail, see the COCO Dataset and TensorFlow.

Contents

Screenshots & Gifs

View System

Architecture

Demo

1. Install Docker and Docker Compose

https://www.docker.com/

2. Clone the repository

git clone https://github.com/apot-group/ml-models-in-production.git

3. Start the server

cd ml-models-in-production && docker-compose up

Service URLs
API docs: http://localhost/api/docs
Demo web: http://localhost

Go to the demo web app at http://localhost and test it with your own picture.
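You can also exercise the API directly instead of using the web UI. The client sketch below uploads an image with `requests`; the endpoint path and the `file` form-field name are assumptions — check http://localhost/api/docs for the actual request schema.

```python
# Hypothetical client for the demo API; path and field name are assumptions.
import requests

def submit_image(path: str) -> dict:
    """Upload an image for object detection and return the JSON response."""
    with open(path, "rb") as f:
        resp = requests.post("http://localhost/api/predict", files={"file": f})
    resp.raise_for_status()
    return resp.json()
```

With the stack running, `submit_image("my_picture.jpg")` should return the API's JSON response for the uploaded image.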

Test

Contact Us