vikhyat/mixtral-inference


mixtral-inference

Inference code for Mistral's "Mixtral" 8x7B (mixtral-8x7b-32kseqlen) mixture-of-experts model. Largely based on the Mistral 7B inference repository. Requires roughly 100 GB of VRAM.
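In a Mixtral-style mixture-of-experts layer, a router scores all 8 experts for each token, and only the top 2 are evaluated, with their outputs mixed by softmax-normalized weights. The sketch below illustrates that routing step in plain Python; the function name and logit values are illustrative, not this repository's actual API.

```python
import math

def top2_routing(router_logits: list[float]) -> list[tuple[int, float]]:
    """Pick the two highest-scoring experts and return (expert_index, weight)
    pairs, with weights softmax-normalized over the selected pair only."""
    top2 = sorted(range(len(router_logits)),
                  key=lambda i: router_logits[i], reverse=True)[:2]
    exps = [math.exp(router_logits[i]) for i in top2]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(top2, exps)]

# Example: 8 router logits, one per expert; experts 1 and 4 win here.
selected = top2_routing([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3])
```

The two mixing weights always sum to 1, so the layer's output is a convex combination of the two chosen experts' outputs.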

Dependencies

PyTorch, SentencePiece, and xformers.

pip install -r requirements.txt
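For reference, a minimal requirements file covering the three dependencies named above might look like the following (unpinned; the repository's actual requirements.txt may pin specific versions):

```
torch
sentencepiece
xformers
```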

Usage

The script assumes you have 8 CUDA devices; you can change this near the bottom of main.py.

python main.py
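With fewer than 8 GPUs, one common approach is to place more than one of the 8 experts on each device. The sketch below shows a simple round-robin expert-to-device mapping; the function name is hypothetical and this is not necessarily how main.py shards the model.

```python
def assign_experts(num_experts: int = 8, num_devices: int = 8) -> dict[int, int]:
    """Round-robin mapping from expert index to CUDA device index.
    With 8 experts on 4 devices, experts 0 and 4 share device 0,
    experts 1 and 5 share device 1, and so on."""
    if num_devices < 1:
        raise ValueError("need at least one device")
    return {e: e % num_devices for e in range(num_experts)}

# e.g. on a 4-GPU machine:
mapping = assign_experts(num_experts=8, num_devices=4)
```

Each device then holds num_experts / num_devices expert weight shards, which is why VRAM per GPU grows as the device count shrinks.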

About

inference code for mixtral-8x7b-32kseqlen
