TypeError: 'BertTokenizer' object is not callable #6

Open
littlelaska opened this issue Nov 16, 2022 · 0 comments
@littlelaska

While migrating PyTorch code to the MindSpore framework, I used cybertron as a replacement for transformers.
The original code is:
from transformers import BertTokenizer
model_name_or_path = "dmis-lab/biobert-base-cased-v1.1"
tfm_tokenizer = BertTokenizer.from_pretrained(model_name_or_path)
a = tfm_tokenizer("this is not ok", padding="max_length", max_length=25, truncation=True, return_tensors="pt")
The output is shown in the screenshot attached to the original issue.
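For reference, the structure of what that transformers call returns (a rough sketch of the keys and shapes, not the exact values from the screenshot):

# __call__ on a transformers tokenizer returns a dict-like BatchEncoding;
# with return_tensors="pt" each field is a torch tensor padded/truncated to max_length=25.
print(a.keys())              # dict_keys(['input_ids', 'token_type_ids', 'attention_mask'])
print(a["input_ids"].shape)  # torch.Size([1, 25])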

The corresponding cybertron code is:
import cybertron
model_name_or_path = "dmis-lab/biobert-base-cased-v1.1"
ms_tokenizer = cybertron.BertTokenizer.load(model_name_or_path)
ms_tokenizer("this is not ok", padding="max_length", max_length=25, truncation=True, return_tensors="pt")
Running it fails with the error shown in the screenshot attached to the original issue (TypeError: 'BertTokenizer' object is not callable).

I looked at the cybertron source code, and the tokenizer's methods are not fully implemented (the HuggingFace-style __call__ interface appears to be missing). Could the author please fix this? Thanks.
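As a possible stop-gap until cybertron's tokenizer supports being called directly (a sketch only, not the project's intended fix): reuse the transformers BertTokenizer purely for the text-to-id step, which needs no torch when return_tensors is left unset, and convert the resulting lists into MindSpore tensors by hand.

import mindspore
from transformers import BertTokenizer

model_name_or_path = "dmis-lab/biobert-base-cased-v1.1"
hf_tokenizer = BertTokenizer.from_pretrained(model_name_or_path)

# Leave return_tensors unset so plain Python lists come back instead of torch tensors.
encoded = hf_tokenizer("this is not ok", padding="max_length", max_length=25, truncation=True)

# Wrap each field in a batch dimension and convert to MindSpore tensors.
ms_inputs = {key: mindspore.Tensor([value], dtype=mindspore.int32) for key, value in encoded.items()}
print(ms_inputs["input_ids"].shape)  # (1, 25)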

@lvyufeng lvyufeng added the enhancement New feature or request label Dec 9, 2022
@lvyufeng lvyufeng self-assigned this Dec 16, 2022