
No package metadata was found for bitsandbytes #2249

Open
Lagstill opened this issue Apr 24, 2024 · 6 comments
Labels
bug Something isn't working

Comments

@Lagstill

Describe the bug

Used the following (imports added for completeness; the quantization config is passed to `from_pretrained`, and `device_map` is used since `device` is not a valid keyword there):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quantization_config,
    device_map="cpu",
    token=ACCESS_TOKEN,
)
```

and

`model.push_to_hub('Lagstill/Varsity_module2_bot')`, which was successful (authenticated via the Hugging Face CLI), and the tokenizer was pushed the same way.

But when loading the model from its model card on the HF website, I get: `No package metadata was found for bitsandbytes in Lagstill/Varsity_module2_bot`
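For context, the hosted widget appears to import bitsandbytes whenever the pushed `config.json` contains a `quantization_config` block (an assumption, not confirmed in this thread). A minimal sketch of a workaround, using only the stdlib, that strips that block from a local copy of `config.json` so the checkpoint can be re-pushed and loaded unquantized:

```python
# Sketch of a possible workaround (assumption: the Inference widget tries to
# import bitsandbytes because the pushed config.json carries a
# "quantization_config" block). Removing the block lets the model load
# unquantized; this is not an official fix.
import json
from pathlib import Path


def strip_quantization_config(config_path: str) -> bool:
    """Remove the quantization_config entry from a config.json, if present.

    Returns True if the file was modified, False if there was nothing to do.
    """
    path = Path(config_path)
    config = json.loads(path.read_text())
    if "quantization_config" not in config:
        return False
    del config["quantization_config"]
    path.write_text(json.dumps(config, indent=2))
    return True
```

After running this on the local checkpoint directory, the edited `config.json` would need to be pushed back to the repo for the widget to pick it up.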

Reproduction

No response

Logs

_No package metadata was found for bitsandbytes_

System info

python-dotenv~=1.0.1
torch~=2.2.2
transformers~=4.38.2
bitsandbytes~=0.42.0; sys_platform == 'darwin'
bitsandbytes~=0.43.1; 'darwin' not in sys_platform 
accelerate~=0.28.0
pypdf~=4.1.0
tiktoken~=0.6.0
sentence-transformers~=2.5.1
faiss-cpu~=1.8.0
langchain~=0.1.16
streamlit~=1.29.0
Lagstill added the bug Something isn't working label Apr 24, 2024
@Wauplin
Contributor

Wauplin commented Apr 24, 2024

Hi @Lagstill, sorry you're facing this issue. It looks like this is more related to bitsandbytes rather than the huggingface_hub package, right? Might be worth opening the issue directly on their repo if that's the case.

Otherwise, could you explain more about "But when loading the model from its model card on the HF website, I get No package metadata was found for bitsandbytes in Lagstill/Varsity_module2_bot"? Is this in your browser? If yes, can you share a screenshot? Or is this in a script, and in that case, what is the script and the full stacktrace?

@jeevanchaps

I have the same issue. Any solutions?

@Wauplin
Contributor

Wauplin commented Apr 30, 2024

Same answer as what I mentioned above: I would need more context to help you investigate the issue. Also worth asking directly on the bitsandbytes project :)

@jeevanchaps

jeevanchaps commented Apr 30, 2024

I was fine-tuning an LLM and installed all the required libraries, including bitsandbytes. After fine-tuning, I pushed the model to the Hugging Face Hub. From there it should have been as straightforward as any other model card, but when using the Inference API on the pushed model it says `No package metadata was found for bitsandbytes`.

[Screenshot 2024-04-30 at 8:52:09 AM: the Inference API widget on the pushed model showing the error]

As in the screenshot above, the problem arises when you click Compute, even though the model has already been pushed to the Hugging Face Hub.

Hope this clarifies what the problem is. If you need further information, here is the full discussion on the Hugging Face forum, which hasn't been solved as of now: https://discuss.huggingface.co/t/inference-api-for-fine-tuned-model-not-working-no-package-metadata-was-found-for-bitsandbytes/75457

@Wauplin
Contributor

Wauplin commented May 3, 2024

Thanks for clarification @jeevanchaps. I reported it internally. I'm linking the PR that should fix it https://github.com/huggingface/api-inference/pull/1902/ (private repo). I'll let you know once it's deployed.

@jeevanchaps

jeevanchaps commented May 3, 2024

Hi @Wauplin, I have another concern on the Hugging Face Hub as well. I used the Sakonii/distilbert-base-nepali model to fine-tune on my custom dataset and then pushed it to the Hub. When I try to use the Inference API on the Hub, I get the error "Can't load tokenizer using from_pretrained, please update its configuration: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 83 column 3".

Any ideas or solutions for it? Using this model in a pipeline locally works perfectly fine, but the Hugging Face Hub inference widget gives the error mentioned above. The repo I pushed my model to is xap/Sentiment_Analysis_NepaliCovidTweets
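This `untagged enum PyPreTokenizerTypeWrapper` error is commonly seen when the pushed `tokenizer.json` was written by a newer `tokenizers` release than the one reading it (an assumption here, since neither version is shown in the thread). A quick stdlib-only sketch to see which pre-tokenizer type the file declares, which can then be checked against the `tokenizers` version on the loading side:

```python
# Diagnostic sketch (assumes a tokenizers-format tokenizer.json). If the
# declared pre_tokenizer type is unknown to the `tokenizers` version doing the
# loading, deserialization fails with the "untagged enum" error above.
import json
from typing import Optional


def pre_tokenizer_type(tokenizer_json_text: str) -> Optional[str]:
    """Return the declared pre_tokenizer type from a tokenizer.json payload,
    or None if no pre_tokenizer is configured."""
    data = json.loads(tokenizer_json_text)
    pre = data.get("pre_tokenizer")
    if pre is None:
        return None
    return pre.get("type")
```

If the reported type is one the local `tokenizers` build does not know, upgrading `tokenizers` (and `transformers`) on the loading side would be the first thing to try.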
