Optimization groups #1630

Closed · Answered by nathanielsimard
aegroto asked this question in Q&A

We don't support optimization groups just yet, at least not with a convenient API. However, you can create multiple optimizers with different learning rates and extract the gradients from the backward pass into multiple `GradientsParams`. You can do that using a visitor, grouping the gradients into each parameter group you want.
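
For reference, here is a minimal sketch of that two-optimizer approach, written against Burn's custom-training-loop pattern. The `encoder`/`head` submodules, `forward_loss`, and the learning rates are hypothetical placeholders, and exact signatures (e.g. `GradientsParams::from_grads`, `Optimizer::step`, whether `Gradients` is cloneable) can vary between Burn versions, so treat this as an outline rather than drop-in code:

```rust
use burn::optim::{AdamConfig, GradientsParams, Optimizer};

// Inside a training loop. `model`, its `encoder`/`head` submodules, and
// `forward_loss` are hypothetical; only the GradientsParams / Optimizer
// flow follows Burn's custom-training-loop pattern.
let mut optim_encoder = AdamConfig::new().init();
let mut optim_head = AdamConfig::new().init();

for batch in dataloader.iter() {
    let loss = model.forward_loss(batch); // hypothetical: returns a scalar loss tensor
    let grads = loss.backward();

    // Extract each submodule's gradients into its own GradientsParams.
    // If `Gradients` is not cloneable in your Burn version, gather both
    // groups in a single visitor pass instead.
    let grads_encoder = GradientsParams::from_grads(grads.clone(), &model.encoder);
    let grads_head = GradientsParams::from_grads(grads, &model.head);

    // Each group gets its own optimizer and learning rate.
    model.encoder = optim_encoder.step(1.0e-4, model.encoder, grads_encoder);
    model.head = optim_head.step(1.0e-3, model.head, grads_head);
}
```

For finer-grained groups than whole submodules, a `ModuleVisitor` can walk the model's parameters and route each `ParamId`'s gradient into the appropriate `GradientsParams`, which is the grouping step described above.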
