A framework for large scale recommendation algorithms.
-
Updated
Jun 7, 2024 - Python
A framework for large scale recommendation algorithms.
Repository for Project Insight: NLP as a Service
I will implement Fastai in each projects present in this repository.
Federated Learning Utilities and Tools for Experimentation
CLIP (Contrastive Language–Image Pre-training) for Italian
[NeurIPS 2023] Michelangelo: Conditional 3D Shape Generation based on Shape-Image-Text Aligned Latent Representation
Pytorch implementation of image captioning using transformer-based model.
A compilation of the best multi-agent papers
Symbolic music generation taking inspiration from NLP and human composition process
A Transformer Implementation that is easy to understand and customizable.
This project investigates the security of large language models by performing binary classification of a set of input prompts to discover malicious prompts. Several approaches have been analyzed using classical ML algorithms, a trained LLM model, and a fine-tuned LLM model.
Retrieval-based Voice Conversion (RVC) implemented with Hugging Face Transformers.
Public repo for the paper: "Modeling Intensification for Sign Language Generation: A Computational Approach" by Mert Inan*, Yang Zhong*, Sabit Hassan*, Lorna Quandt, Malihe Alikhani
AI/ML using transformers, reinforcement learning, deep learning models and related frameworks
CHARacter-awaRE Diffusion: Multilingual Character-Aware Encoders for Font-Aware Diffusers That Can Actually Spell
[TMI 2023] XBound-Former: Toward Cross-scale Boundary Modeling in Transformers
Neural Persian Poet: A sequence-to-sequence model for composing Persian poetry
An autoML for explainable text classification.
Train a T5 model to generate simple Fake News and use a RoBERTa model to classify what's fake and what's real.
The special repository to demonstrate how you can use transformers for Swahili text classification
Add a description, image, and links to the transformers-models topic page so that developers can more easily learn about it.
To associate your repository with the transformers-models topic, visit your repo's landing page and select "manage topics."