
Minor error in Finetuning example code #67

Open
Baekpica opened this issue Nov 29, 2021 · 0 comments
@Baekpica

I think there is a trivial error in the 'freeze' method.

When a user intends to freeze the attention layers, they will set freeze_attn=True.

However, none of the names returned by model.module.named_parameters() contain the substring 'attn', so they can never be caught by the condition "elif 'attn' in name:". Instead, the attention parameters fall through to the final else branch, which means that when a user freezes the 'other' layers, the layers whose names contain 'attention' get frozen too.
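A quick way to confirm this (a minimal sketch; `model` stands for the wrapped model used in the finetuning example):

```python
# Inspect the parameter names to see which substrings actually occur.
names = [name for name, _ in model.module.named_parameters()]
print(any('attn' in n for n in names))       # expected: False
print(any('attention' in n for n in names))  # expected: True
```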

The proper fix is to revise "'attn' in name" to "'attention' in name".
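For illustration, here is a sketch of what the corrected branch could look like (the surrounding structure and keyword arguments are assumptions based on the example code, not a verbatim copy):

```python
def freeze(model, freeze_emb=False, freeze_ln=False,
           freeze_attn=True, freeze_ff=True, freeze_other=False):
    # Toggle requires_grad per parameter group, matched by name substring.
    for name, p in model.module.named_parameters():
        name = name.lower()
        if 'ln' in name or 'norm' in name:
            p.requires_grad = not freeze_ln
        elif 'embeddings' in name:
            p.requires_grad = not freeze_emb
        elif 'mlp' in name:
            p.requires_grad = not freeze_ff
        elif 'attention' in name:  # was 'attn', which never matched any name
            p.requires_grad = not freeze_attn
        else:  # attention parameters no longer fall through to this branch
            p.requires_grad = not freeze_other
    return model
```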

Anyway, thank you to all the developers of ruDALL-E for providing this awesome pre-trained model!
