
Returning KL divergence #75

Open
Wilco17 opened this issue Jan 28, 2020 · 4 comments
Labels
enhancement New feature or request good first issue Good for newcomers

Comments

@Wilco17

Wilco17 commented Jan 28, 2020

Thank you for this fantastic work!

Would it be possible for the fit_transform() method to return the KL divergence of the run?

Thanks!

@DavidMChan DavidMChan added enhancement New feature or request good first issue Good for newcomers labels Jan 29, 2020
@stu-blair

Related question, @DavidMChan: is the Avg. Gradient Norm printed to the log the same as, or analogous to, the KL divergence?

@DavidMChan
Member

DavidMChan commented Jul 23, 2021

The average gradient norm is essentially the norm of the gradient of the KL divergence with respect to the particle positions. It can therefore serve as a proxy for how stable the optimization process is, but it is not the same as the KL divergence itself.
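To make the distinction concrete, here is a small NumPy sketch (an illustration only, not tsne-cuda's actual implementation) that computes both KL(P || Q) and the norm of its gradient with respect to the embedding, using the standard t-SNE gradient:

```python
import numpy as np

def kl_divergence(P, Q, eps=1e-12):
    """KL(P || Q) for two discrete distributions on the same support."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    mask = P > 0                        # 0 * log(0/q) contributes nothing
    return float(np.sum(P[mask] * np.log(P[mask] / (Q[mask] + eps))))

def tsne_grad_and_q(P, Y):
    """Standard t-SNE gradient of KL(P || Q) w.r.t. the embedding Y:
       dC/dy_i = 4 * sum_j (p_ij - q_ij) * (y_i - y_j) / (1 + ||y_i - y_j||^2)
    Returns the gradient and the low-dimensional affinities Q."""
    diff = Y[:, None, :] - Y[None, :, :]             # pairwise differences
    W = 1.0 / (1.0 + np.sum(diff ** 2, axis=-1))     # Student-t kernel
    np.fill_diagonal(W, 0.0)
    Q = W / W.sum()                                  # low-dim affinities
    PQW = (P - Q) * W
    grad = 4.0 * (np.diag(PQW.sum(axis=1)) - PQW) @ Y
    return grad, Q

# Toy data: a symmetric, normalized "high-dimensional" affinity matrix P
rng = np.random.default_rng(0)
n = 6
P = rng.random((n, n))
P = P + P.T
np.fill_diagonal(P, 0.0)
P /= P.sum()

Y = rng.standard_normal((n, 2))          # random 2-D embedding
grad, Q = tsne_grad_and_q(P, Y)

kl = kl_divergence(P, Q)                 # the quantity requested in this issue
grad_norm = np.linalg.norm(grad)         # what the log reports (a proxy)
```

A small gradient norm means the optimizer has (locally) converged; the KL value itself says how well Q matches P at that point. The two can disagree: a poor local minimum has a small gradient norm but a large KL.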

@stu-blair

Thanks for the explanation! That makes sense.

So, just to confirm: there's currently no way to see the KL divergence value?

@DavidMChan
Member

DavidMChan commented Jul 23, 2021

Currently, no. But I'll consider working it into the next version, and we always welcome PRs if anyone wants to contribute!

Here would be a good place in the code to start looking:

float grad_norm = tsnecuda::util::L2NormDeviceVector(old_forces_device);
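For anyone picking this up, the user-facing change might look like scikit-learn's TSNE, which stores the final KL of the run as a `kl_divergence_` attribute after `fit_transform()`. Here is a minimal sketch of that interface shape (the class name and the placeholder "optimizer" are hypothetical, not tsne-cuda's actual code):

```python
import numpy as np

class SketchTSNE:
    """Hypothetical interface sketch: fit_transform() returns the embedding
    as before, and the final KL of the run is stored on the estimator."""

    def __init__(self, n_components=2):
        self.n_components = n_components
        self.kl_divergence_ = None

    def fit_transform(self, X):
        X = np.asarray(X, dtype=float)
        # Placeholder for the real GPU optimization loop; tsne-cuda would
        # compute the embedding and the final KL value here.
        embedding = X[:, : self.n_components]
        self.kl_divergence_ = 0.0   # dummy value standing in for the real KL
        return embedding
```

Storing the value as an attribute, rather than changing the return type of `fit_transform()`, keeps backwards compatibility with existing callers.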
