Fine-tuning for Baichuan and Baichuan2, plus Alpaca fine-tuning
Updated Apr 21, 2024 - Python
A fine-tuned version of the "TinyPixel/Llama-2-7B-bf16-sharded" model, trained on the "timdettmers/openassistant-guanaco" dataset.
A Multimodal Approach to Convert Book Summaries into Artistic Book Covers
Implementation for fine-tuning a Falcon-7b model using QLoRA on the Spider dataset. The repository focuses on the task of converting natural language questions into SQL commands.
Our project addresses the challenge of multi-document summarization with Large Language Models (LLMs), which are constrained by token length limitations. We propose a novel approach that combines the strengths of LLMs and Maximal Marginal Relevance (MMR).
Kickstart with LLMs
Code for fine-tuning the Llama 2 LLM on a custom text dataset to produce film-character-styled responses.
A bash scripting assistant that helps you automate tasks. Powered by a Streamlit chat interface, a fine-tuned nl2bash model generates bash code from natural-language descriptions provided by the user.
A Llama-2 model fine-tuned to generate Docker commands.
A lightweight Korean language model.
Fine-tuning of Falcon-7b: ROC plays the part of an average D&D player; present it with a situation, and it will explain the thought process of an average player.
A PEFT model based on pythia-410m-deduped.
Fine-tune Mistral 7B v1.0 on a custom dataset.
A simple custom QLoRA implementation for fine-tuning a large language model (LLM) with basic tools such as PyTorch and bitsandbytes, completely decoupled from Hugging Face.
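The core idea behind a Hugging Face-free (Q)LoRA implementation like the one described above is small enough to sketch directly: freeze a pretrained linear layer and add a trainable low-rank update. The sketch below is a minimal, hypothetical illustration in plain PyTorch (class name `LoRALinear` and hyperparameters `r`/`alpha` are assumptions, not code from that repository); a real QLoRA setup would additionally quantize the frozen base weights to 4-bit via bitsandbytes.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank (LoRA) update.

    forward(x) = base(x) + (alpha / r) * x @ A^T @ B^T
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the pretrained weights stay frozen

        # Low-rank factors: A is small random, B is zero so the update
        # starts as a no-op and training only gradually perturbs the base.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


# Example: wrap a layer and check that only the LoRA factors are trainable.
base = nn.Linear(16, 8)
layer = LoRALinear(base, r=4)
x = torch.randn(2, 16)
out = layer(x)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
```

Because `lora_B` is zero-initialized, the wrapped layer reproduces the base layer's output exactly before any training, which is the standard LoRA initialization trick.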
An LLM challenge to (i) fine-tune a pre-trained Hugging Face transformer model to build a code-generation language model, and (ii) build a retrieval-augmented generation (RAG) application using LangChain.
Fine-tuned FLAN-T5 using full instruction fine-tuning, LoRA-based PEFT, and RLHF with PPO.