
Add iterator interface #745

Open · wants to merge 6 commits into master

Conversation

@tkf (Contributor) commented Sep 9, 2019

This PR adds an iterator interface for the optimizers. I tentatively call it Optim.optimizing. It provides an iterator-based API such that

let istate
    for istate′ in optimizing(args...; kwargs...)
        istate = istate′
    end
    OptimizationResults(istate)
end

is equivalent to optimize(args...; kwargs...) for multivariate optimizations.
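
For concreteness, here is a rough usage sketch of the pattern above. It assumes this PR's tentative names (Optim.optimizing, Optim.OptimizationResults) plus the state accessors discussed later in this thread (minimum(istate), minimizer(result)); the objective, starting point, solver, and stopping threshold are only placeholders.

using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

result = let istate
    for istate′ in Optim.optimizing(rosenbrock, [0.0, 0.0], BFGS())
        istate = istate′
        # Custom stopping logic is easy to express in the iterator form:
        minimum(istate) < 1e-10 && break
    end
    Optim.OptimizationResults(istate)
end

minimizer(result)  # the usual result accessors still apply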

Closes #599.

@with_kw struct IteratorState{IT <: OptimIterator, TR <: OptimizationTrace}
    # Put `OptimIterator` in iterator state so that `OptimizationResults` can
    # be constructed from `IteratorState`.
    iter::IT

@tkf (Contributor, Author) commented:

We can drop the iter field from IteratorState if we change the API to:

let istate
    iter = optimizing(args...; kwargs...)
    for istate′ in iter
        istate = istate′
    end
    OptimizationResults(iter, istate)  # need to pass `iter` here
end

@tkf (Contributor, Author) commented:

Accessor functions like iteration_limit_reached(istate) and (f|g|h)_calls(istate) need .iter, too.

@pkofod (Member) commented Sep 9, 2019

Interesting, thanks. That took surprisingly few changes, but then it was mostly written as an iterator already; it just wasn't an actual iterator 😄. I'll have to take a more thorough look!

@codecov (bot) commented Sep 9, 2019

Codecov Report

Merging #745 into master will decrease coverage by 0.04%.
The diff coverage is 80.55%.

@@            Coverage Diff             @@
##           master     #745      +/-   ##
==========================================
- Coverage   81.46%   81.41%   -0.05%     
==========================================
  Files          43       43              
  Lines        2422     2438      +16     
==========================================
+ Hits         1973     1985      +12     
- Misses        449      453       +4
Impacted Files Coverage Δ
src/multivariate/optimize/optimize.jl 77.96% <80%> (-1.11%) ⬇️
src/multivariate/optimize/interface.jl 76.81% <81.81%> (ø) ⬆️


@codecov (bot) commented Sep 9, 2019

Codecov Report

Merging #745 into master will decrease coverage by 0.11%.
The diff coverage is 73.23%.

@@            Coverage Diff             @@
##           master     #745      +/-   ##
==========================================
- Coverage   81.53%   81.41%   -0.12%     
==========================================
  Files          43       43              
  Lines        2431     2470      +39     
==========================================
+ Hits         1982     2011      +29     
- Misses        449      459      +10
Impacted Files Coverage Δ
src/api.jl 61.29% <23.07%> (-1.64%) ⬇️
src/multivariate/optimize/optimize.jl 80.26% <83.72%> (+0.26%) ⬆️
src/multivariate/optimize/interface.jl 78.08% <86.66%> (+1.27%) ⬆️
src/multivariate/solvers/constrained/samin.jl 75.91% <0%> (-0.73%) ⬇️



function OptimizationResults(istate::IteratorState)
    @unpack_IteratorState istate
    @unpack d, initial_x, method, options, state = iter

    after_while!(d, state, method, options)

@tkf (Contributor, Author) commented:

Executing a mutating function after_while! inside the non-bang function OptimizationResults is not ideal. But it looks like after_while! is mostly a no-op except for NelderMead, so maybe it's OK? From a quick look, after_while! for NelderMead seems to be idempotent (so that, e.g., calling OptimizationResults(istate) multiple times inside a loop is OK). If that's the case, OptimizationResults(istate) practically has no side effects.

But using a function name like result! sounds good to me as well.
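
If that speculation holds, a quick check along the following lines should confirm it. This is only a hypothetical test sketch: it assumes the tentative optimizing name, uses a throwaway objective, and relies on the existing minimizer/minimum result accessors.

using Optim, Test

f(x) = sum(abs2, x .- 1)

r1, r2 = let istate
    for istate′ in Optim.optimizing(f, zeros(2), NelderMead())
        istate = istate′
    end
    # Construct the results twice from the same state; if after_while! is
    # idempotent, the second call sees an already-finalized state and the
    # two results should agree.
    Optim.OptimizationResults(istate), Optim.OptimizationResults(istate)
end

@test minimizer(r1) == minimizer(r2)
@test minimum(r1) == minimum(r2)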

function centroid_trace(r::Union{MultivariateOptimizationResults, IteratorState})
    if !isa(r.method, NelderMead)
        throw(ArgumentError("There is no centroid involved in optimization using $(r.method). Please use x_trace(...) to grab the points from the trace."))
    end
    tr = trace(r)

@tkf (Contributor, Author) commented Sep 10, 2019

I suppose tr = trace(r) should be added here? I think it would throw UndefVarError otherwise. There are two more places where I made this change.

I'm including these changes (adding tr = trace(r)) even though they are not directly related to this PR.

@tkf (Contributor, Author) commented Sep 10, 2019

@pkofod After posting the PR, I realized that the first version didn't provide a way to access the iteration state (e.g., getting the "best minimizer so far"), so I added accessor functions for the iterator state by reusing the function names/code already defined for the result type. The diff may not be as small as you remember now.

Anyway, I think I've implemented the APIs I need. I'm looking forward to hearing your thoughts on this.

@tkf (Contributor, Author) commented Sep 10, 2019

Aside: after using this API a few times, I realized that the istate = istate′ pattern for leaking the iteration variable to the outer scope is rather ugly, although I guess there is not much we can do about it. For a subset of problems, defining a foreach method like the one below is somewhat useful.

function Base.foreach(f, iter::OptimIterator)
    local istate
    for istate′ in iter
        istate = istate′
        f(istate)
    end
    return istate
end

We can then use it as

result = foreach(optimizing(args...; kwargs...)) do istate
    @info "Minimizing..." minimizer(istate) minimum(istate)
end |> Optim.OptimizationResults

although this does not let us use break.

(Or we can implement transducer-compatible foldl so that foreach is as powerful as for loop 😉)
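
For completeness, a middle ground that recovers early termination without the full foldl machinery would be to let the callback signal when to stop. This is a hypothetical helper (foreach_until is not part of this PR); the usage line mirrors the placeholder args...; kwargs... style above, and the accessors are the ones discussed earlier in the thread.

function foreach_until(f, iter)   # expects the iterator returned by optimizing
    local istate
    for istate′ in iter
        istate = istate′
        f(istate) === :stop && break   # the callback returns :stop to end early
    end
    return istate
end

last_state = foreach_until(optimizing(args...; kwargs...)) do istate
    @info "Minimizing..." minimizer(istate) minimum(istate)
    minimum(istate) < 1e-10 ? :stop : nothing
end
result = Optim.OptimizationResults(last_state)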

@tkf (Contributor, Author) commented Sep 27, 2019

Friendly ping :)

Is there anything I can do to make it easier to merge?

@pkofod (Member) commented Sep 27, 2019

Sorry, I've had too much on my plate. I'll take a look soon.

@tkf (Contributor, Author) commented Sep 27, 2019

Thanks!

@cmey commented Mar 14, 2023

🛎️ Can we have this?

I am still interested in plotting the evolution of the optimization: https://discourse.julialang.org/t/plotting-the-evolution-of-the-optimization-using-optim-jl/41978
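
(For reference, a sketch of what that could look like with the iterator interface proposed here; optimizing and minimum(istate) follow the naming used above, and the objective, solver, and plotting call are placeholders.)

using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

best_values = Float64[]
for istate in Optim.optimizing(rosenbrock, [-1.2, 1.0], NelderMead())
    push!(best_values, minimum(istate))   # best objective value so far
end

# best_values now holds one entry per iteration and can be plotted,
# e.g. Plots.plot(best_values, xlabel = "iteration", ylabel = "f(x)").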


Successfully merging this pull request may close these issues.

Offer iterator interface
3 participants