
MultiHeadedAttention results do not match onmt #259

Open
ArtyZe opened this issue Feb 26, 2022 · 1 comment

Comments

@ArtyZe

ArtyZe commented Feb 26, 2022

Hello, and thank you very much for sharing this project. While testing turbo_transformers I found a problem: when the Q matrix contains negative values, or when the mask matrix contains True entries, the outputs of turbo_transformers and onmt no longer match. Is there something wrong with my testing method?
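One common cause of this kind of mismatch (an assumption on my part, not confirmed from the screenshot) is that the two libraries interpret the boolean mask with opposite conventions: in one, `True` means "mask this position out"; in the other, `True` means "keep it". A minimal NumPy sketch of scaled dot-product attention weights, illustrating how the same inputs diverge under the two conventions (this is illustrative code, not the actual turbo_transformers or onmt implementation):

```python
import numpy as np

def attention_weights(scores, mask, mask_means_exclude=True):
    """Softmax over attention scores under a boolean mask.

    scores: (seq, seq) raw QK^T / sqrt(d) logits
    mask:   (seq, seq) boolean array; its meaning depends on the library
    """
    s = scores.copy()
    if mask_means_exclude:
        s[mask] = -1e9    # convention A: True positions are masked out
    else:
        s[~mask] = -1e9   # convention B: True positions are kept
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((2, 2))
mask = np.array([[False, True], [False, False]])

w_exclude = attention_weights(scores, mask, mask_means_exclude=True)
w_keep = attention_weights(scores, mask, mask_means_exclude=False)
# Identical scores and mask, yet the two conventions produce different
# weight matrices -- enough to make two implementations "disagree".
```

If the mismatch only appears when the mask contains `True`, checking which convention each library expects (and inverting the mask if needed) would be a reasonable first step.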

[screenshot attached: test script and mismatched outputs]

@feifeibear
Collaborator

This should have nothing to do with the sign of the Q matrix. Could you run your input data through the unit tests?
