
BHHH for Likelihood Optimization #1055

Open
ParadaCarleton opened this issue Oct 16, 2023 · 3 comments
@ParadaCarleton

BHHH is a second-order algorithm that (conceptually) approximates the Hessian of the log-likelihood using the sum of outer products of the per-observation score vectors (the gradients of each observation's log-likelihood contribution). This is justified by the information matrix equality in statistics, which states that, at the true parameter value, E(g * g') = -E(hessian), where g is the score of a single observation. This makes the outer-product-of-gradients (OPG) matrix a consistent estimator of the negative Hessian that can usually be computed much more cheaply than the full Hessian. This method is widely used in statistics.
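To make the idea above concrete, here is a minimal numpy sketch of a BHHH iteration (this is illustrative pseudocode in Python, not Optim.jl code; the logistic-regression likelihood and all names are just a stand-in example). Each step solves against the OPG matrix `G'G` instead of the true Hessian:

```python
import numpy as np

# Simulated logistic-regression data (stand-in likelihood for illustration).
rng = np.random.default_rng(0)
n, k = 500, 3
X = rng.normal(size=(n, k))
beta_true = np.array([0.5, -1.0, 0.25])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)

def scores(beta):
    """Per-observation score of the log-likelihood: g_i = (y_i - p_i) * x_i, shape (n, k)."""
    p = 1 / (1 + np.exp(-X @ beta))
    return (y - p)[:, None] * X

beta = np.zeros(k)
for _ in range(25):
    G = scores(beta)                 # (n, k) matrix of per-observation scores
    B = G.T @ G                      # BHHH approximation: OPG in place of -Hessian
    # Newton-like ascent step: beta + B^{-1} * (sum of scores)
    beta = beta + np.linalg.solve(B, G.sum(axis=0))
```

Because `B` is always positive semidefinite, the step is an ascent direction for the log-likelihood, which is one practical attraction of BHHH over a raw Newton step.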

Is there an implementation of BHHH in Optim.jl, or are there any plans to add it?

@pkofod (Member) commented Dec 12, 2023

I'm well aware of BHHH, but I'm not sure what you would want beyond the Newton method? Is it because you want Optim to automatically write the outer product of the score using AD?

@ParadaCarleton (Author)

Yep!

@pkofod (Member) commented Jan 26, 2024

Okay, then I suppose you'd have to either a) have a vector objective type that can then be interpreted according to some aggregation (a sum here, I suppose, because you'd have likelihood contributions in your use case), or b) simply a constructor that builds a normal objective type?
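One way to read suggestion (a) is a helper that accepts per-observation contributions and aggregates them into the scalar objective, gradient, and OPG Hessian approximation. A hypothetical sketch in Python (numpy) — the function names and interface are made up for illustration, not Optim.jl API:

```python
import numpy as np

def make_bhhh_objective(loglik_contrib, score_contrib):
    """Hypothetical aggregator: build the scalar objective (negative
    log-likelihood, for minimization), its gradient, and the BHHH/OPG
    Hessian approximation from per-observation contributions."""
    def f(theta):
        return -loglik_contrib(theta).sum()        # negative log-likelihood
    def g(theta):
        return -score_contrib(theta).sum(axis=0)   # gradient of f = minus the summed scores
    def h(theta):
        G = score_contrib(theta)                   # (n, k) matrix of per-observation scores
        return G.T @ G                             # OPG: stands in for the Hessian of f
    return f, g, h

# Toy use: normal location model with unit variance, data x, parameter theta (k = 1).
x = np.array([1.0, 2.0, 3.0])
f, g, h = make_bhhh_objective(
    lambda th: -0.5 * (x - th[0]) ** 2,            # l_i(theta)
    lambda th: (x - th[0])[:, None],               # score_i = x_i - theta
)
theta = np.zeros(1)
theta = theta - np.linalg.solve(h(theta), g(theta))  # one BHHH step toward the MLE (the sample mean)
```

The appeal of this shape is that the user only supplies per-observation quantities (which AD could produce from the contribution function), and the OPG matrix falls out of the same score matrix used for the gradient.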
