
Any AutoConfig.from_pretrained call results in FutureWarning: resume_download is deprecated and will be removed in version 1.0.0. #2275

Closed
tomaarsen opened this issue May 8, 2024 · 2 comments
Labels: bug (Something isn't working)

tomaarsen (Member) commented May 8, 2024

Describe the bug

Any AutoConfig.from_pretrained call results in FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. This is caused by an internal call in AutoConfig that passes resume_download=False as a default option:

https://github.com/huggingface/transformers/blob/508c0bfe555936fc772cd000e2e8da739f777a4f/src/transformers/configuration_utils.py#L650

Feel free to transfer this issue to transformers if you believe the fix should be applied there instead. Either way, we shouldn't get deprecation warnings when using the packages normally.
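
In the meantime, a minimal workaround sketch (my own, not an official fix), assuming one only wants to hide this specific warning until a patched release is out: suppress just this FutureWarning with Python's standard warnings module.

import warnings

# Workaround sketch: ignore only the `resume_download` deprecation warning
# emitted by huggingface_hub; all other warnings remain visible.
warnings.filterwarnings(
    "ignore",
    message=r"`resume_download` is deprecated",
    category=FutureWarning,
)

from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-cased")  # warning no longer printed

This only hides the symptom; the deprecated default still needs to be addressed in transformers or huggingface_hub.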

Reproduction

from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-cased")

Logs

[sic]\envs\sentence-transformers\Lib\site-packages\huggingface_hub\file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
config.json: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 570/570 [00:00<?, ?B/s]

System info

- huggingface_hub version: 0.23.0
- Platform: Windows-10-10.0.22631-SP0
- Python version: 3.11.6
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: C:\Users\tom\.cache\huggingface\token
- Has saved token ?: True
- Who am I ?: tomaarsen
- Configured git credential helpers: manager
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.3.0+cu121
- Jinja2: 3.1.2
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.1.0
- hf_transfer: 0.1.6
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.1
- pydantic: 2.4.2
- aiohttp: 3.8.5
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: C:\Users\tom\.cache\huggingface\hub
- HF_ASSETS_CACHE: C:\Users\tom\.cache\huggingface\assets
- HF_TOKEN_PATH: C:\Users\tom\.cache\huggingface\token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: True
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
tomaarsen added the bug (Something isn't working) label on May 8, 2024

aymenkrifa commented May 13, 2024

@tomaarsen,
huggingface/transformers#30620 fixes the issue for me, but a new release hasn't been rolled out yet. You can install transformers from source to check whether it resolves the warning for you, but be careful: the main branch is not always stable, so use it only to confirm that the change works, and then wait for a new release.
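
For anyone who wants to verify locally, a rough sketch (assuming transformers has been installed from the main branch, e.g. with pip install git+https://github.com/huggingface/transformers.git): escalate this specific warning to an error so the reproduction fails loudly if the deprecated resume_download default is still being passed.

import warnings
from transformers import AutoConfig

# Verification sketch: turn the deprecation warning into an error so the call
# raises if the installed transformers still passes the deprecated
# `resume_download` default down to huggingface_hub.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "error",
        message=r"`resume_download` is deprecated",
        category=FutureWarning,
    )
    config = AutoConfig.from_pretrained("bert-base-cased")

print("No `resume_download` deprecation warning emitted.")

If the call completes without raising, the installed build no longer triggers the deprecation path.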

tomaarsen (Member, Author) commented May 13, 2024

Excellent! Then I think this is all set. Thanks!

  • Tom Aarsen
