This repository contains a number of experiments with multilingual Transformer models (Multi-Lingual BERT, DistilBERT, XLM-RoBERTa, mT5, and ByT5) focused on the Dutch language.
In this research, the performance of three state-of-the-art models was compared on a machine translation task. The code used to implement the models in this project was sourced from the GitHub repository at https://github.com/JoyeBright/MT-HF.git.
License plate localizer using pre-trained YOLOv5, combined with text extraction using pre-trained TrOCR
Comparing the residual stream and the highway stream in Transformers (BERT).
This code implements a text search algorithm: given an input text, it returns a summary of the most relevant paragraph from a given list of paragraphs.
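The retrieval step described above can be sketched with a simple frequency-weighted word-overlap scorer (a minimal illustration, not the repository's actual code; the summarization step is omitted):

```python
import re
from collections import Counter

def most_relevant_paragraph(query, paragraphs):
    """Return the paragraph sharing the most (frequency-weighted) words with the query."""
    def tokens(text):
        # Lowercase and split on non-letter characters, counting occurrences.
        return Counter(re.findall(r"[a-z']+", text.lower()))
    q = tokens(query)
    # Score each paragraph by its summed token overlap with the query.
    scores = [sum(min(q[w], p[w]) for w in q) for p in map(tokens, paragraphs)]
    return paragraphs[max(range(len(paragraphs)), key=scores.__getitem__)]
```

A real system would likely use TF-IDF or embedding similarity instead of raw overlap, but the retrieve-then-summarize shape is the same.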
A Streamlit app that paraphrases input text, i.e. rewrites a sentence to keep the same meaning with a different structure and word choice. It uses Hugging Face's Text-to-Text Transfer Transformer (T5) as the base model.
A financial dashboard built using Streamlit, fine-tuned Transformers models and Prophet. Includes auto summarisation, sentiment analysis, and trend forecasting of stock and crypto news.
Convert the normal maps used in the game Transformers: Fall of Cybertron to the Mikk format
(Re)implementations of neural networks in Keras from scratch.
Learning about transformers by following along with Umar Jamil.
Minimalistic PyTorch implementation of transformer
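The core operation such minimal transformer implementations build around is scaled dot-product self-attention. A minimal NumPy sketch of a single head (all names here are illustrative, not taken from the repository):

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over a (seq_len, d_model) input."""
    q, k, v = x @ wq, x @ wk, x @ wv              # project input to queries/keys/values
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # pairwise attention logits
    # Numerically stable row-wise softmax over the logits.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                            # weighted sum of values
```

A full transformer layer adds multiple heads, a feed-forward block, residual connections, and layer normalization around this core.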
Premise Selection using OEIS portal
Text-To-Text Textbots to Demonstrate Output Differences Between Models Trained on Filtered/Unfiltered Datasets for HSS4 - The Modern Context: Select Figures and Topics
Simple Deep Learning model for training image classifiers using the BEiT method
A from-scratch implementation of the GPT (generative pre-trained transformer) model, along with the GPT encoder, that produces Shakespearean text by training on Shakespeare's dialogues.
This project is meant to generate a Local Language Model based on textual input.
LSTM models for text classification on character embeddings.
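Character embeddings start from mapping each character to an integer id; an embedding layer then turns those ids into dense vectors for the LSTM. A minimal encoding sketch (hypothetical helpers, not the repository's code):

```python
def build_char_vocab(texts):
    """Map each distinct character to an integer id, reserving 0 for padding."""
    chars = sorted({ch for text in texts for ch in text})
    return {ch: i + 1 for i, ch in enumerate(chars)}

def encode(text, vocab, max_len):
    """Encode a string as a fixed-length list of char ids, padded/truncated to max_len."""
    ids = [vocab.get(ch, 0) for ch in text[:max_len]]     # unknown chars map to 0
    return ids + [0] * (max_len - len(ids))
```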
Fine-tuned Transformer models from Hugging Face
An advanced NLP project building a sarcasm/irony classifier, covering several methods from BiLSTM models to fine-tuning Transformer models (BERT, T5, MPNet).
This repository is dedicated to small projects and some theoretical material that I used to get into NLP and LLMs in a practical and efficient way.