
429 error in InferenceClient #2175

Open · 4 tasks
sooryansatheesh opened this issue Mar 29, 2024 · 2 comments

Comments

@sooryansatheesh
System Info

429 Client Error: Too Many Requests for url: https://api-inference.huggingface.co/models

Who can help?

@SunMarc


Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

I got the above error when I was trying to get tabular classification predictions from my own model.

I used the code below:

import pandas as pd
from huggingface_hub import InferenceClient

input_data = [2, 3, 4, 2, 4]
# cols_used and model_id are defined elsewhere in my script
df = pd.DataFrame([input_data], columns=cols_used)

client = InferenceClient()

table = df.to_dict(orient="records")

print(table)
client.tabular_classification(table=table, model=model_id)

Can someone help me?

Expected behavior

A prediction in the form of a single number from the model.

@ArthurZucker ArthurZucker transferred this issue from huggingface/transformers Mar 30, 2024
@ArthurZucker

cc @Wauplin

@Wauplin
Contributor

Wauplin commented Apr 2, 2024

@sooryansatheesh In your reproducible script above

from huggingface_hub import InferenceClient
input_data = [2, 3, 4, 2, 4]
df = pd.DataFrame([input_data], columns=cols_used)

client = InferenceClient()

table = df.to_dict(orient="records")

print(table)
client.tabular_classification(table=table, model=model_id)

would you mind sharing what values you used for cols_used and model_id? Without them, it's hard to reproduce.

In general, HTTP 429 means you got rate-limited. Using an HF token should lift the rate limit, which might solve your situation. Another possibility is that your model doesn't load on our Inference API servers, but to investigate that we would need the model id.
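For what it's worth, here is a minimal sketch of an authenticated client; the token value is a placeholder and model_id stands for whatever repo id you used:

from huggingface_hub import InferenceClient

# Passing a User Access Token authenticates requests and lifts the
# anonymous rate limit. Replace "hf_xxx" with your own token.
client = InferenceClient(token="hf_xxx")

# Optionally, check whether the model is loaded on the Inference API
# before sending requests (model_id is a placeholder here).
status = client.get_model_status(model_id)
print(status.loaded, status.state)

Alternatively, setting the HF_TOKEN environment variable or running huggingface-cli login has the same effect as passing the token explicitly.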
