
Pure Python PreTrainedTokenizer is Broken #30696

Open
daskol opened this issue May 7, 2024 · 1 comment

Comments


daskol commented May 7, 2024

System Info

transformers v4.40.2
tokenizers v0.19.1

Who can help?

@ArthurZucker

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

>>> from transformers import GPT2Tokenizer
>>> tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
...
TypeError: unhashable type: 'AddedToken'
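For context, this kind of TypeError arises whenever an object that defines `__eq__` without `__hash__` is used as a dictionary key, which the traceback suggests is happening with `AddedToken` here. A minimal standalone illustration (the `Token` class below is a hypothetical stand-in, not the actual `AddedToken` implementation):

```python
class Token:
    """Hypothetical stand-in for an AddedToken-like object."""

    def __init__(self, content):
        self.content = content

    # Defining __eq__ without also defining __hash__ sets
    # __hash__ to None, making instances unhashable.
    def __eq__(self, other):
        return isinstance(other, Token) and self.content == other.content


try:
    vocab = {Token("<pad>"): 0}  # using an unhashable object as a dict key
except TypeError as exc:
    print(exc)  # unhashable type: 'Token'
```

If the slow tokenizer's added-vocabulary handling builds such a dict, restoring a `__hash__` implementation (or keying on the token's string content) would avoid the crash.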

Expected behavior

No exception is raised.

@ArthurZucker (Collaborator)
Sorry, but I cannot reproduce this. Could you make sure you are running this on 4.40?
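One way to confirm which versions are actually installed in the active environment is to query package metadata directly; a small stdlib-only sketch (it does not import `transformers` itself, so it works even when the import is broken):

```python
import importlib.metadata


def installed_version(pkg):
    """Return the installed version string of pkg, or None if it is absent."""
    try:
        return importlib.metadata.version(pkg)
    except importlib.metadata.PackageNotFoundError:
        return None


for pkg in ("transformers", "tokenizers"):
    print(pkg, installed_version(pkg))
```

Checking this way rules out a stale install shadowing the expected 4.40.x release.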
