
Is there any way to avoid overfitting while training over 200 epochs? #202

Open
Chanuku opened this issue Sep 15, 2020 · 3 comments

Comments


Chanuku commented Sep 15, 2020

[attached image: "overfitting"]

As written in the title, I'm looking for hints on how to avoid overfitting when training for more than 200 epochs.

I tried changing the learning rate, but it didn't work.

junyanz (Collaborator) commented Sep 16, 2020

It seems that your learning rate has turned negative. This might be causing the training failure.
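
For context, here is a minimal sketch of how a linear learning-rate decay of this kind can turn negative once training runs past the scheduled horizon. The function and the `n_epochs` / `n_epochs_decay` names below are illustrative assumptions, not necessarily the exact code or option names used in this repo:

```python
# Illustrative linear decay rule: keep the base learning rate for the first
# `n_epochs` epochs, then decay it linearly over the next `n_epochs_decay`.
def linear_decay(epoch, n_epochs=100, n_epochs_decay=100):
    """Multiplier applied to the base learning rate at the given epoch."""
    return 1.0 - max(0, epoch - n_epochs) / float(n_epochs_decay + 1)

# With a 100 + 100 schedule the multiplier stays in (0, 1] up to epoch 200,
# but it keeps decreasing afterwards and eventually drops below zero:
print(linear_decay(200))  # ~0.01
print(linear_decay(250))  # ~-0.49  -> negative learning rate
```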

Chanuku (Author) commented Sep 16, 2020

Thanks for the reply.

If you know, could you give me a clue about which parameter I should change to avoid this?

junyanz (Collaborator) commented Sep 17, 2020

You need to update your learning rate decay policy.
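
In other words (a sketch under the same illustrative assumptions as above): make sure the constant-rate and decay phases together cover the whole run, and/or clamp the multiplier at zero, so the learning rate can never go negative no matter how long you train. If the training script exposes options for the lengths of these two phases, passing larger values there when training beyond the default schedule should avoid the negative rate.

```python
# Sketch: extend the schedule (e.g. 200 constant + 200 decay epochs for a
# 400-epoch run) and clamp the multiplier at zero as a safety net. Option
# names such as n_epochs / n_epochs_decay are assumptions for illustration.
def linear_decay(epoch, n_epochs=200, n_epochs_decay=200):
    lr_mult = 1.0 - max(0, epoch - n_epochs) / float(n_epochs_decay + 1)
    return max(lr_mult, 0.0)  # never let the learning rate go below zero

print(linear_decay(250))  # ~0.75, still positive
print(linear_decay(400))  # ~0.005, approaches zero instead of turning negative
```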
