Hello,

I was comparing the implementations of brent.jl (in Optim.jl) and brentmin.jl (in NLSolvers.jl) and noticed a few differences. I was wondering if someone could shed some light on which of the two is correct.
First, while brent.jl uses:
```julia
if abs(new_minimizer - x_midpoint) <= 2*x_tol - (x_upper-x_lower)/2
```
as the stopping criterion, brentmin.jl has the second condition turned around:
```julia
if abs(x - m) > 2 * tol - (b - a) / 2 # stopping crit
```
(`b - a` instead of `a - b`). A quick test in my own implementation suggests that brentmin.jl is incorrect, since even simple functions no longer converge.
Then, when selecting the new step, brent.jl and brentmin.jl also differ: they treat the case of `d == 0` differently.

Both are correct; the conditions are turned around because in Optim.jl the code path with the condition you mention stops if it is true, while in NLSolvers.jl the code path continues if it is true, hence the swapped inequality. The reference implementation in Brent's book has it the same way NLSolvers.jl does, so if that condition is true (no convergence yet), it continues.

As for the second part, that is true: it should be `copysign(tol, d)`, as in Brent's book.
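Since the thread's snippets are Julia, here is a deliberately simplified sketch in Python of the stopping test under discussion. It uses a plain golden-section search rather than Brent's full method with parabolic interpolation, and `golden_section_min` plus the fixed `tol` are made up for illustration; the point is only that the book-style test (continue while `abs(x - m) > 2*tol - (b - a)/2`) terminates fine on simple functions:

```python
import math

def golden_section_min(f, a, b, tol=1e-8, max_iter=200):
    """Minimize a unimodal f on [a, b] by golden-section search,
    stopping with the Brent-style test on midpoint m and best point x."""
    invphi = (math.sqrt(5) - 1) / 2  # 0.618..., the inverse golden ratio
    c = b - invphi * (b - a)         # left interior probe
    d = a + invphi * (b - a)         # right interior probe
    fc, fd = f(c), f(d)
    for _ in range(max_iter):
        x = c if fc < fd else d      # current best point
        m = (a + b) / 2              # midpoint of the bracket
        # Brent's book continues while this holds, and stops otherwise:
        if not (abs(x - m) > 2 * tol - (b - a) / 2):
            return x
        if fc < fd:                  # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                        # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return c if fc < fd else d
```

In Brent's actual routine `tol` is recomputed every iteration from a relative and an absolute tolerance at the current `x`; a fixed `tol` keeps the sketch short.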
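On the `copysign(tol, d)` point: the book's update takes the proposed step `d` when it is at least `tol` long, and otherwise takes a minimum step of size `tol` in the direction of `d`. A small Python illustration (the names `tol` and `d` follow the thread; the ternary fallback is a hypothetical hand-rolled variant, not code from either package) of why `copysign` is the right tool even when `d` is zero:

```python
import math

tol = 1e-8

# copysign(tol, d) keeps the sign of d, including IEEE signed zeros:
assert math.copysign(tol, 0.5) == tol
assert math.copysign(tol, -0.5) == -tol
assert math.copysign(tol, 0.0) == tol    # +0.0 -> +tol
assert math.copysign(tol, -0.0) == -tol  # -0.0 -> -tol

# A hand-rolled branch disagrees with copysign exactly at d == 0,
# where it silently picks the negative direction:
d = 0.0
assert (tol if d > 0 else -tol) == -tol
assert math.copysign(tol, d) == tol
```

So special-casing `d == 0` (or branching on `d > 0`) changes which direction the minimum-length step is taken in, which is exactly the discrepancy noted above.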