
Does turbo support HuggingFace's BART model? #257

Open
will-wiki opened this issue Jan 20, 2022 · 4 comments

Comments

@will-wiki

No description provided.

@will-wiki
Author

will-wiki commented Jan 20, 2022

So far I have only seen demos for BERT and for a decoder. Does turbo support an encoder-decoder model like BART? Would the two components be passed to turbo separately?

@feifeibear
Collaborator

feifeibear commented Jan 21, 2022

Turbo has supported a standard encoder-decoder NMT model. I haven't looked into the details of BART, but I expect the approach is similar.
https://github.com/TurboNLP/Translate-Demo/blob/master/mytranslator.py

@will-wiki
Author

will-wiki commented Jan 21, 2022

OK, thanks a lot. Looking at the demo, though, only the decoder part seems to be handled. Is that because the encoder runs only once, so the acceleration is applied only to the decoder, which dominates the runtime?

@feifeibear
Collaborator

The example should handle both. The encoder is similar to BERT, so it is easy to modify.
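For reference, here is a minimal sketch of the kind of encoder conversion described above, using the turbo_transformers `BertModel.from_torch` converter on a HuggingFace BERT encoder. Applying the same pattern to BART's encoder (which is structurally similar to BERT but not identical) is an assumption of this sketch, not something the demo or this thread confirms.

```python
import torch
import transformers
import turbo_transformers

# Load a standard HuggingFace BERT encoder; turbo_transformers ships a
# from_torch converter for this architecture.
model = transformers.BertModel.from_pretrained("bert-base-uncased")
model.eval()

# Convert the PyTorch weights into a turbo_transformers model.
# Assumption: a BART encoder would need an analogous conversion step;
# BertModel.from_torch itself only accepts a BERT-style module.
turbo_encoder = turbo_transformers.BertModel.from_torch(model)

# Encode once; in an encoder-decoder setup this output would then be
# reused by the (separately accelerated) decoder at every decoding step.
input_ids = torch.tensor([[101, 7592, 2088, 102]])  # "[CLS] hello world [SEP]"
with torch.no_grad():
    encoder_output = turbo_encoder(input_ids)
```

Since the encoder runs only once per input while the decoder runs once per generated token, converting the encoder is mostly a one-line change like the above, whereas the decoder loop is where most of the speedup comes from, as the linked Translate-Demo shows.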
