I think there is a small error in the 'freeze' method.

When a user intends to freeze the attention layers, they will set freeze_attn=True. However, no parameter names returned by model.module.named_parameters() actually contain the substring 'attn', so the condition "elif 'attn' in name:" never matches. As a result, when the user freezes the 'other' layers, the layers whose names contain 'attention' get frozen too, regardless of freeze_attn.

The proper fix would be to change "'attn' in name" to "'attention' in name".
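A minimal sketch of the corrected filter, for illustration only: the `freeze_params` helper and the mock `Param` class below are hypothetical, and the exact structure of the real `freeze` method in ruDALL-E may differ. The point is just the substring check.

```python
def freeze_params(named_params, freeze_attn=False, freeze_other=False):
    """Toggle requires_grad on parameters selected by name substring.

    `named_params` is an iterable of (name, param) pairs, like the one
    returned by model.module.named_parameters() in PyTorch.
    """
    for name, p in named_params:
        name = name.lower()
        # Was: "elif 'attn' in name" -- this never matched, because the
        # actual parameter names contain 'attention', not 'attn'.
        if 'attention' in name:
            p.requires_grad = not freeze_attn
        else:
            p.requires_grad = not freeze_other


# Quick demonstration with stand-in parameter objects (no torch needed).
class Param:
    def __init__(self):
        self.requires_grad = True


params = {
    'transformer.layers.0.attention.query_key_value.weight': Param(),
    'transformer.layers.0.mlp.dense_h_to_4h.weight': Param(),
}
freeze_params(params.items(), freeze_attn=True, freeze_other=False)
# The attention weight is now frozen; the mlp weight stays trainable.
```

With the original `'attn'` check, the attention weight would have fallen through to the `else` branch and been (un)frozen together with the 'other' layers.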
Anyway, thank you to all the developers of ruDALL-E for providing an awesome pre-trained model!