Default GA and DE don't seem to work in v0.11.1 #98

Open
ForceBru opened this issue Jul 28, 2022 · 6 comments

Comments

@ForceBru

TL;DR

GA() and DE() don't move away from the initial point: they declare whatever initial point you give them the optimum and report convergence, even when that point is nowhere near the actual optimum.

I'm new to this, so this entire issue might be stupid, but I can't get the algorithms to work even with the default settings; I can't even get started with the most basic things. Maybe the defaults could be adjusted somehow?

Basic example

Try to minimize f(x) = x^2:

julia> Evolutionary.optimize(x->x[1]^2, [90.], GA())

 * Status: success

 * Candidate solution
    Minimizer:  [90.0]
    Minimum:    8100.0
    Iterations: 12

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0001 (vs limit Inf)
    Iterations:    12
    f(x) calls:    650


julia> Evolutionary.optimize(x->x[1]^2, [90.], DE())

 * Status: success

 * Candidate solution
    Minimizer:  [90.0]
    Minimum:    8100.0
    Iterations: 12

 * Found with
    Algorithm: DE/random/1/binxvr

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-10

 * Work counters
    Seconds run:   0.0006 (vs limit Inf)
    Iterations:    12
    f(x) calls:    600

No, 90^2 = 8100 is most definitely not the minimum of x^2.

I can fiddle with the optimizer's settings, but it still doesn't move from the initial point:

julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(populationSize=1000)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0

julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(populationSize=1000, n=2)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0

julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(n=2)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0

All of these runs also report convergence; I've omitted the full output for brevity.

I eventually got the genetic algorithm to work after specifying mutation=gaussian(), crossover=uniformbin() (the defaults are apparently no-ops, but having no mutation and no crossover seems to defeat the purpose of a genetic algorithm?):

julia> Evolutionary.optimize(x->x[1]^2, [90.], GA(mutation=gaussian(), crossover=uniformbin()))

 * Status: success

 * Candidate solution
    Minimizer:  [0.0006461356743545348]
    Minimum:    4.1749130967358945e-7
    Iterations: 267

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0033 (vs limit Inf)
    Iterations:    267
    f(x) calls:    13400

However, I couldn't get differential evolution to work, even for this simple function. Below are examples taken from the docs that don't seem to work either.

GA example

This is taken from https://wildart.github.io/Evolutionary.jl/dev/tutorial/#Obtaining-results.

julia> Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), Evolutionary.GA())


 * Status: success

 * Candidate solution
    Minimizer:  [false, false, false]
    Minimum:    0
    Iterations: 11

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0066 (vs limit Inf)
    Iterations:    11
    f(x) calls:    600

So apparently, the minimum is f([0,0,0]) = 0. However, I can get a lower value: f([1,1,1]) = -3. In fact, GA seems to accept any initial value as the solution:

julia> Evolutionary.optimize(x->-sum(x), BitVector([0,0,1]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 0
 1

julia> Evolutionary.optimize(x->-sum(x), BitVector([0,1,0]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 1
 0

julia> Evolutionary.optimize(x->-sum(x), BitVector([0,1,1]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 1
 1

DE algorithm

I took the target function from this page: https://wildart.github.io/Evolutionary.jl/dev/constraints/.

julia> f(x)=(x[1]+2x[2]-7)^2+(2x[1]+x[2]-5)^2 # Booth

Then I use the genetic algorithm as shown on that page:

ga = GA(populationSize=100, selection=uniformranking(3),
        mutation=gaussian(), crossover=uniformbin())
Evolutionary.optimize(f, [0., 0.], ga) |> Evolutionary.minimizer

If I run this multiple times (GA seems to be randomized, so I wanted to draw more samples), I get about [1, 3] on average, which matches the Booth function's known minimum at (1, 3). So far so good.

Now try the default DE() with the same function and the same starting point [0., 0.]:

julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0

julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0

julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0

Change the starting point:

julia> Evolutionary.optimize(f, [0., 100.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
   0.0
 100.0

julia> Evolutionary.optimize(f, [90., 5.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.0

julia> Evolutionary.optimize(f, [90., 5.56375], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.56375

julia> Evolutionary.optimize(f, [π, 5.56375], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 3.141592653589793
 5.56375

DE doesn't seem to care and says that the starting point is the optimum, whatever the starting point is.

GA again

Now try the same function as above with the default GA:

julia> Evolutionary.optimize(f, [90., 5.], GA()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.0

julia> Evolutionary.optimize(f, [90., π], GA()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  3.141592653589793

Same as the default DE: it thinks that the initial point is the optimum, whatever the initial point is.

@ForceBru
Author

Looks like I got it. Apparently, when I specify a single initial point, the entire population is just copies of that point:

"""
initial_population(method, individual::AbstractVector)
Initialize population by replicating the `individual` vector.
"""
initial_population(method::M, individual::I; kwargs...) where {M<:AbstractOptimizer, I<:AbstractVector} =
[copy(individual) for i in 1:population_size(method)]

And then there's probably not enough variability for mutation or crossover to change anything, so the population doesn't change.
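To make that concrete, here's a tiny sketch (plain Julia, not package code) of the situation the replicated population creates:

pop = [copy([90.0]) for _ in 1:50]     # effectively what initial_population builds from x0 = [90.0], P = 50
@assert all(p == pop[1] for p in pop)  # every individual is identical, so there is nothing to recombine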

If I use BoxConstraints, then even DE starts working.
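For reference, a minimal sketch of that workaround (the bounds here are arbitrary, just wide enough to contain Booth's minimum, and I'm assuming the optimize(f, constraints, x0, method) form shown on the constraints docs page linked above):

cnst = BoxConstraints([-10.0, -10.0], [10.0, 10.0])   # arbitrary box around the optimum
Evolutionary.optimize(f, cnst, [0., 0.], DE()) |> Evolutionary.minimizer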

@wildart
Owner

wildart commented Aug 5, 2022

Thanks for testing the package in this mode. I would never try to use these algorithms without specific parameters, but I understand that the experience can be frustrating for someone new to evolutionary optimization.

You are correct, the default parameters are useless. But there is no way to set default parameters that work for every model, because the operators are population dependent: if the population is represented by binary strings, you need operators specific to binary strings; the same goes for numerical functions, and so on. I think the best approach is to terminate the optimization (with a useful message) when no-op operators are used.

The same problem exists with the initial population. Most evolutionary algorithms require sufficient randomness in the population to optimize properly; a population in which every individual has the same value totally defeats the technique. Adding some noise to a point-initialized population may work much better.
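For example, something like this hypothetical helper (not part of Evolutionary.jl; the noise scale σ is arbitrary) could build such a population:

noisy_population(x0::AbstractVector, N::Integer; σ=0.1) =
    [x0 .+ σ .* randn(length(x0)) for _ in 1:N]   # N distinct individuals scattered around x0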

If you use BoxConstraints in the optimization call, the population is sampled within the box, which benefits the optimization results.

@ForceBru
Author

ForceBru commented Aug 5, 2022

default parameters are useless ... there is no way to set default parameters that work for every model, because the operators are population dependent

Maybe there shouldn't be any default parameters then? Having "useless" default parameters is:

  1. confusing and
  2. leads to incorrect examples in the documentation: Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), GA()) reports [0,0,0] as the minimizer, but it isn't.

The no-argument constructor could throw an error, for example:

DE() = error("There is no way to set default parameters that work for every model, because the operators are population dependent; please set the parameters manually.")

It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.

On the other hand, the default DE() seems to work fine when I use BoxConstraints, so default params don't seem all that useless...


a population in which every individual has the same value totally defeats the technique

Does this mean that the optimize methods that accept the indiv parameter aren't particularly useful? With ordinary gradient-based methods the initial guess is extremely important, so my first instinct was to specify the initial individual.

Also, the documentation recommends Evolutionary.optimize(f, x0, CMAES()), and that actually works with CMAES, even though the population should consist of copies of x0.

@wildart
Owner

wildart commented Aug 6, 2022

It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.

Exactly my point: performing evolutionary optimization correctly is not only about the initial guess, but also about which mutation operations are used and how they are applied to the population, i.e. the rates.

Does this mean that the optimize methods that accept the indiv parameter aren't particularly useful? With ordinary gradient-based methods the initial guess is extremely important, so my first instinct was to specify the initial individual.

In the current implementation, it is problematic. I think more randomness needs to be added in that case.

Also, the documentation recommends Evolutionary.optimize(f, x0, CMAES()), and that actually works with CMAES, even though the population should consist of copies of x0.

By default, CMAES initializes its parameters with a very involved procedure that was refined over many years of research. CMAES becomes very fragile with incorrect parameters, and tuning it for the best performance requires a good understanding of both the algorithm and the problem.

@mylo19

mylo19 commented Nov 8, 2022

I came across the same issue.

I suggest changing the (otherwise really helpful) tutorial so it doesn't include examples that use the default parameters of GA and DE. These examples were the first thing I tried in order to familiarize myself with the package, and it was a bit frustrating to get weird results even when copying the code verbatim. It took me some time to realise that the problem might be with the parameters and not with the syntax I used.

@wildart wildart added the docs label Jan 19, 2023
@jnsbck

jnsbck commented Oct 19, 2023

Same here. In my case I was using Evolutionary.jl with Optimization.jl, where you have to set an x0 when defining the OptimizationProblem.

I got it to work by initializing with an initial_population rather than an individual x0. I sampled from a uniform distribution, but I could also imagine just adding randn to x0.

using Distributions, Evolutionary, Optimization  # `foo` (the objective) and `p` (its parameters) are defined elsewhere

NP = 100
de = Evolutionary.DE(populationSize=NP)
lb, ub = 0.0, 1.0
U = Uniform(lb, ub)
# build a randomized initial population instead of letting a single x0 be replicated
x0 = Evolutionary.initial_population(de, [rand(U, 1) for i in 1:NP])

f = OptimizationFunction(foo)
prob = Optimization.OptimizationProblem(f, x0, p, lb=lb, ub=ub)
sol = solve(prob, de)
