No package metadata was found for bitsandbytes #2249
Comments
Hi @Lagstill, sorry you're facing this issue. It looks like this is more related to the Inference API than to this repository. Otherwise, could you explain more? You wrote: "But while loading the model card on the HF website I get *No package metadata was found for bitsandbytes* in Lagstill/Varsity_module2_bot." Is this in your browser? If yes, can you share a screenshot? Or is this in a script, and in that case, what is the script and the full stack trace?
I have the same issue. Any solutions?
Same answer as what I mentioned above: I would need more context to help you investigate the issue. It's also worth asking directly on the Hugging Face forum.
I was fine-tuning an LLM and installed all the required libraries, including bitsandbytes. After I completed the fine-tuning, I pushed the model to the Hugging Face Hub. Once pushed, it should have worked like other model cards, but when I use the Inference API on the model, it says *No package metadata was found for bitsandbytes*. As in the image above, the problem arises when you click Compute, even though the model has already been pushed to the Hub. I hope this clarifies what the problem is.

If you need further information, here is a full discussion on the Hugging Face forum, which hasn't been solved as of now: https://discuss.huggingface.co/t/inference-api-for-fine-tuned-model-not-working-no-package-metadata-was-found-for-bitsandbytes/75457
Thanks for the clarification @jeevanchaps. I reported it internally. I'm linking the PR that should fix it: https://github.com/huggingface/api-inference/pull/1902/ (private repo). I'll let you know once it's deployed.
Hi @Wauplin, I have another concern on the Hugging Face Hub as well. I fine-tuned the Sakonii/distilbert-base-nepali model on my custom dataset and then pushed it to the Hub. When I try to use the Inference API, I get the error "Can't load tokenizer using from_pretrained, please update its configuration: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 83 column 3". Any ideas or solutions? Using this model as a pipeline locally works perfectly fine, but the Hub's Inference API gives the error above. The repo I pushed my model to is xap/Sentiment_Analysis_NepaliCovidTweets.
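That enum error typically means the deserializer could not recognize the `pre_tokenizer` section of `tokenizer.json` (often because it was written by a newer `tokenizers` version than the one reading it, though that is an assumption here, not confirmed for this repo). A small stdlib sketch for inspecting which pre-tokenizer type the file declares, assuming the standard `tokenizer.json` layout:

```python
import json
from pathlib import Path

def pre_tokenizer_type(tokenizer_json_path: str):
    """Return the top-level 'type' of the pre_tokenizer block in a
    tokenizer.json file, or None if the block is absent."""
    data = json.loads(Path(tokenizer_json_path).read_text())
    pre = data.get("pre_tokenizer")
    return pre.get("type") if isinstance(pre, dict) else None
```

Comparing that type (and the file's structure around line 83) against what your locally installed `tokenizers` version supports can help pinpoint the version mismatch.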
Describe the bug
Used the following:

model.push_to_hub('Lagstill/Varsity_module2_bot')

which succeeded via the huggingface CLI; the tokenizer was pushed similarly. But when loading the model card on the HF website, I get *No package metadata was found for bitsandbytes* in Lagstill/Varsity_module2_bot.
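A workaround commonly reported for this class of error (a sketch, not the official fix for this issue) rests on the observation that a model fine-tuned with bitsandbytes quantization writes a `quantization_config` block into `config.json`, which in turn makes the serving environment try to load bitsandbytes. Stripping that block from the local config before (re-)pushing avoids the lookup; the key name follows the standard transformers config layout, and the path is whatever your local checkpoint uses:

```python
import json
from pathlib import Path

def strip_quantization_config(config_path: str) -> bool:
    """Remove the 'quantization_config' entry from a transformers
    config.json. Returns True if the entry was present and removed."""
    path = Path(config_path)
    config = json.loads(path.read_text())
    removed = config.pop("quantization_config", None) is not None
    path.write_text(json.dumps(config, indent=2))
    return removed
```

Note the caveat: with the block removed, the model loads unquantized, so this only helps if the pushed weights were saved in full precision.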
Reproduction
No response
Logs
_No package metadata was found for bitsandbytes_
System info