Would you mind sharing the values you used for `cols_used` and `model_id`? Without them, it's hard to reproduce.
In general, HTTP 429 means you were rate limited. Using an HF token should raise the rate limit, which might resolve your situation. Another possibility is that your model doesn't load on our Inference API servers, but to investigate that we would need the model id.
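As a sketch of the token suggestion above (the environment variable name `HF_TOKEN` is a convention, not required by the library), an authenticated client can be built like this:

```python
import os

from huggingface_hub import InferenceClient

# Read a User Access Token from the environment instead of hard-coding it.
# Anonymous requests are rate limited more aggressively than authenticated ones.
token = os.environ.get("HF_TOKEN")

# Passing the token here attaches it to every request the client makes.
client = InferenceClient(token=token)
```

Requests made through this client should then count against the token's (higher) rate limit rather than the anonymous one.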
System Info
@SunMarc
429 Client Error: Too Many Requests for url: https://api-inference.huggingface.co/models
Who can help?
I got the above error when trying to get tabular classification predictions from my own model. I used the code below (`cols_used` and `model_id` are defined elsewhere in my script):

```python
import pandas as pd
from huggingface_hub import InferenceClient

input_data = [2, 3, 4, 2, 4]
df = pd.DataFrame([input_data], columns=cols_used)

client = InferenceClient()
table = df.to_dict(orient="records")
print(table)
client.tabular_classification(table=table, model=model_id)
```
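Aside from the rate limit, the `huggingface_hub` docs show `tabular_classification` taking a column-oriented `table` (a mapping of column name to a list of values), whereas `to_dict(orient="records")` produces a list of row dicts. A minimal sketch of the documented shape, using hypothetical column names in place of `cols_used`:

```python
import pandas as pd

# Hypothetical column names standing in for cols_used.
cols_used = ["a", "b", "c"]
df = pd.DataFrame([[2, 3, 4]], columns=cols_used)

# orient="list" yields one list of values per column,
# e.g. {"a": [2], "b": [3], "c": [4]} for a single row.
table = df.to_dict(orient="list")
print(table)
```

This `table` can then be passed to `client.tabular_classification(table=table, model=model_id)`.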
Can someone help me?
Information

Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

Reproduction
Expected behavior
A prediction in the form of a single number from the model.