🤘 TT-NN operator library and TT-Metalium low-level kernel programming model.
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
An efficient, flexible, and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
Design, conduct and analyze results of AI-powered surveys and experiments. Simulate social science and market research with large numbers of AI agents and LLMs.
Private chat with a local GPT over documents, images, video, and more. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://codellama.h2o.ai/
Multi-node production AI stack. Run the best of open source AI easily on your own servers. Create your own AI by fine-tuning open source models. Integrate LLMs with APIs. Run gptscript securely on the server
Firefly: a training toolkit for large language models, supporting Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models
Easy "1-line" calling of all LLMs from OpenAI, MS Azure, AWS Bedrock, GCP Vertex, and Ollama
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
GPT-4-level function-calling models for real-world tool-use cases
Go package and example utilities for using Ollama / LLMs
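A package like this presumably wraps Ollama's local REST API; a minimal sketch of that underlying endpoint (shown in Python for brevity, assuming an Ollama server on its default port 11434 with a pulled model):

```python
# Sketch of Ollama's /api/generate endpoint, which client packages wrap.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",                # any locally pulled model tag
        "prompt": "Why is the sky blue?",
        "stream": False,                  # one JSON object instead of a stream
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```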
Unofficial Gradio repo for the ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, Heng Ji.
On-device LLM Inference Powered by X-Bit Quantization
Combating LLM FOMO and feeding the shiny object syndrome, for folly and partly for curiosity
ECE-5424 Advanced Machine Learning Final Project - LLM Prompt Recovery task