Use Hugging Face Transformers and practice knowledge distillation, quantization, ONNX, and ORT
Updated Feb 10, 2023 - Jupyter Notebook
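As a minimal sketch of the distillation piece named in the entry above (not taken from that repository; the model names, temperature, and loss weighting are illustrative assumptions), a student Transformer can be trained against a teacher's temperature-softened logits with Hugging Face Transformers and PyTorch, with optional dynamic quantization of the distilled student afterwards:

```python
# Minimal response-based knowledge distillation sketch with Hugging Face
# Transformers; model names, T, and alpha are illustrative assumptions.
import torch
import torch.nn.functional as F
from transformers import AutoModelForSequenceClassification, AutoTokenizer

teacher = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2).eval()            # frozen teacher
student = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)              # smaller student
# DistilBERT shares BERT's vocabulary, so one tokenizer serves both models.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.AdamW(student.parameters(), lr=5e-5)
batch = tokenizer(["a toy example", "another toy example"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([0, 1])

with torch.no_grad():
    teacher_logits = teacher(**batch).logits
loss = distillation_loss(student(**batch).logits, teacher_logits, labels)
loss.backward()
optimizer.step()

# Optional post-training dynamic quantization of the distilled student.
quantized_student = torch.quantization.quantize_dynamic(
    student, {torch.nn.Linear}, dtype=torch.qint8)
```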
Deep Learning Head Pose Estimation using PyTorch
Transfer Learning for Neural Topic Models using Knowledge Distillation
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Knowledge distillation for masked FER using ResNet-18 in PyTorch.
Knowledge Distillation from VGG16 (teacher model) to MobileNet (student model)
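For the VGG16-to-MobileNet pairing above, a minimal vision-side sketch (assuming torchvision ≥ 0.13 and an ImageNet-style label space; the batch, temperature, and loss weights are placeholders rather than the repository's settings) could look like this:

```python
# Illustrative teacher-student distillation step: VGG16 teacher, MobileNetV2 student.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, mobilenet_v2

teacher = vgg16(weights="DEFAULT").eval()        # pretrained teacher, frozen
student = mobilenet_v2(num_classes=1000)         # student trained from scratch

images = torch.randn(4, 3, 224, 224)             # stand-in batch
labels = torch.randint(0, 1000, (4,))

with torch.no_grad():
    t_logits = teacher(images)
s_logits = student(images)

T = 4.0                                          # softening temperature
kd = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
              F.softmax(t_logits / T, dim=-1),
              reduction="batchmean") * (T * T)
ce = F.cross_entropy(s_logits, labels)
loss = 0.7 * kd + 0.3 * ce                       # illustrative weighting
loss.backward()
```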
<WIP>
The project focuses on knowledge distillation and explainability techniques to improve the performance of neural networks on natural images.
Knowledge Distillation for Skin Lesion Classification
Neural Network Compression
Online Knowledge Distillation using LipNet and an Italian dataset. Master's Thesis Project.
cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation
In this project we compare how deep learning models perform when labelled data is scarce, using several methods in current use, and we propose a two-stage training pipeline that yields 60% better accuracy with a 50% smaller model size.
Improving Quality of Multilingual Question Answering and Cross-Lingual Transfer using Multitask Learning, Knowledge Distillation, and Data Augmentation
Distillation of Efficient Dehazing Networks via Soft Knowledge
Use knowledge distillation to improve the capacity of multitask models for eye disease prediction
Simple Ensemble And Knowledge Distillation Framework For Natural Language Understanding
Implementing an ensemble method requires different models; to obtain them, it helps to initialise each one from a different pretrained model (seed weights). This repository contains simple code to generate such seed weights for ensembling.
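A minimal sketch of the seed-weight idea described in the entry above, assuming a toy PyTorch model; the architecture and file names are illustrative, not the repository's actual code:

```python
# Save several differently initialised "seed" checkpoints; each later serves
# as the starting point for one ensemble member.
import torch
import torch.nn as nn

def make_model():
    # Illustrative stand-in architecture.
    return nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

for seed in range(5):
    torch.manual_seed(seed)                       # different RNG seed per member
    model = make_model()
    torch.save(model.state_dict(), f"seed_weights_{seed}.pt")
```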
Master's Thesis - Reproducible Knowledge Distillation on Graphs.
A basic Teacher-Student Network from scratch using ResNet50 on CIFAR-10