
Prediction problem #99

Open

terminator123 opened this issue Jul 27, 2022 · 2 comments

Comments

@terminator123

I'm predicting directly with CDial-GPT2_LCCC-base.
The prediction part of the code had to be modified, otherwise it wouldn't run:

output = model(input_ids, token_type_ids=token_type_ids)
logits = output.logits
logits = logits[0, -1, :] / args.temperature

No matter what the input is, the output is always the following:

[12997, 7635, 12997, 7635, 12997, 12997, 12997, 12997, 7635, 12997, 7635, 12997, 7635, 12997, 7635, 12997, 7635, 12997, 7635, 12997, 12997, 12997, 7635, 7635, 12997, 7635, 12997, 7635, 7635, 12997]
囌 僦 囌 僦 囌 囌 囌 囌 僦 囌 僦 囌 僦 囌 僦 囌 僦 囌 僦 囌 囌 囌 僦 僦 囌 僦 囌 僦 僦 囌
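For context, the modification above is needed because newer versions of transformers return a ModelOutput object from the forward pass (logits accessed via `.logits`), while the repo's original code targeted older versions that returned a plain tuple. A minimal sketch of a version-agnostic accessor (`get_logits` and the `Output` stand-in are hypothetical names, for illustration only):

```python
from collections import namedtuple

def get_logits(output):
    """Return the logits from a model forward pass, whether the model
    returns a ModelOutput-style object (newer transformers) or a plain
    tuple whose first element is the logits (older transformers)."""
    return output.logits if hasattr(output, "logits") else output[0]

# Stand-in for a ModelOutput object (hypothetical, for illustration).
Output = namedtuple("Output", ["logits"])

assert get_logits(Output(logits=[1.0, 2.0])) == [1.0, 2.0]  # new-style
assert get_logits(([1.0, 2.0], None)) == [1.0, 2.0]         # old-style tuple
```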

@xuexidi

xuexidi commented Sep 1, 2022

Same problem here.

@wasssily

Comment out this line: logits, *_ = model(input_ids, token_type_ids=token_type_ids)
and replace it with:
output = model(input_ids, token_type_ids=token_type_ids)
logits = output.logits
logits = logits[0, -1, :] / args.temperature
logits = top_filtering(logits, top_k=args.top_k, top_p=args.top_p)
probs = F.softmax(logits, dim=-1)
Then the generated dialogue comes out correctly.
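The top_filtering call in the fix keeps only the most likely tokens before sampling, which is what prevents the degenerate repeated-token output shown above. A minimal pure-Python sketch of the same top-k / top-p (nucleus) idea, assuming list-of-float logits (the repo's actual implementation operates on torch tensors):

```python
import math

def top_filtering(logits, top_k=0, top_p=0.0, filter_value=-math.inf):
    """Keep only the top_k highest logits, and/or the smallest set of
    logits whose softmax probabilities sum to at least top_p; all other
    positions are set to filter_value so softmax assigns them zero mass."""
    logits = list(logits)
    if top_k > 0:
        # Threshold is the k-th largest logit; everything below it is dropped.
        threshold = sorted(logits, reverse=True)[top_k - 1]
        logits = [l if l >= threshold else filter_value for l in logits]
    if top_p > 0.0:
        # Walk tokens in descending probability until cumulative mass >= top_p.
        order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
        exps = [math.exp(logits[i]) for i in order]
        total = sum(exps)
        cum, keep = 0.0, set()
        for idx, e in zip(order, exps):
            keep.add(idx)
            cum += e / total
            if cum >= top_p:
                break
        logits = [l if i in keep else filter_value for i, l in enumerate(logits)]
    return logits

filtered = top_filtering([2.0, 1.0, 0.5, -1.0], top_k=2)
# Only the two largest logits survive; the rest become -inf.
```

After filtering, softmax over the surviving logits gives the sampling distribution, so low-probability tokens can never be drawn.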
