Optimization.jl is a package whose scope goes beyond that of a typical optimization package: it seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means that you learn one package and you learn them all! Optimization.jl also adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most cases, while still exposing all of the underlying options in a single unified interface.
Assuming that you already have Julia correctly installed, it suffices to install Optimization.jl in the standard way:
using Pkg
Pkg.add("Optimization")
The packages relevant to the core functionality of Optimization.jl will be installed automatically and, in most cases, you do not have to worry about installing dependencies manually. Below is the list of packages that must be installed explicitly if you intend to use the optimization algorithms they offer (an installation sketch follows the list):
- OptimizationBBO for BlackBoxOptim.jl
- OptimizationEvolutionary for Evolutionary.jl (see also this documentation)
- OptimizationGCMAES for GCMAES.jl
- OptimizationMOI for MathOptInterface.jl (usage of algorithms via the MathOptInterface API; see also the API documentation)
- OptimizationMetaheuristics for Metaheuristics.jl (see also this documentation)
- OptimizationMultistartOptimization for MultistartOptimization.jl (see also this documentation)
- OptimizationNLopt for NLopt.jl (usage via the NLopt API; see also the available algorithms)
- OptimizationNOMAD for NOMAD.jl (see also this documentation)
- OptimizationNonconvex for Nonconvex.jl (see also this documentation)
- OptimizationQuadDIRECT for QuadDIRECT.jl
- OptimizationSpeedMapping for SpeedMapping.jl (see also this documentation)
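These wrapper packages are added like any other Julia package. For example, to make the BlackBoxOptim.jl algorithms used later on this page available (a sketch; substitute whichever wrapper from the list you need):

using Pkg
Pkg.add("OptimizationBBO")     # wrapper for BlackBoxOptim.jl
# Pkg.add("OptimizationNLopt")   # likewise for any other wrapper in the list above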
For information on using the package, see the stable documentation. For features that have not yet been released, use the in-development documentation.
using Optimization
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
prob = OptimizationProblem(rosenbrock, x0, p)
using OptimizationOptimJL
sol = solve(prob, NelderMead())
using OptimizationBBO
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
Note that Optim.jl is a core dependency of Optimization.jl. However, BlackBoxOptim.jl is not and must already be installed (see the list above).
Warning: The output of the second optimization task (BBO_adaptive_de_rand_1_bin_radiuslimited()) is currently misleading in the sense that it returns Status: failure (reached maximum number of iterations). However, convergence is actually reached and the confusing message stems from the reliance on the Optim.jl output struct (where the situation of reaching the maximum number of iterations is rightly regarded as a failure). The improved output struct will soon be implemented.
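Regardless of the printed status message, the solution object returned by solve can be inspected directly to verify convergence. A minimal sketch (field names as in recent versions of the SciML solution interface):

sol.u          # minimizer found by the solver
sol.objective  # objective value at that point
sol.retcode    # Optimization.jl's return code
sol.original   # the wrapped solver's own result object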
The output of the first optimization task (with the NelderMead() algorithm) is given below:
* Status: success
* Candidate solution
Final objective value: 3.525527e-09
* Found with
Algorithm: Nelder-Mead
* Convergence measures
√(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08
* Work counters
Seconds run: 0 (vs limit Inf)
Iterations: 60
f(x) calls: 118
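Common solver options such as iteration and time limits can be passed as keyword arguments to solve, independently of the chosen backend. For example, for the box-constrained problem and global solver above (maxiters and maxtime are taken from the common solver options; a sketch, not required for this example):

sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited(); maxiters = 10_000, maxtime = 10.0)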
We can also explore other methods in a similar way:
using ForwardDiff
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, x0, p)
sol = solve(prob, BFGS())
For instance, the above optimization task produces the following output:
* Status: success
* Candidate solution
Final objective value: 7.645684e-21
* Found with
Algorithm: BFGS
* Convergence measures
|x - x'| = 3.48e-07 ≰ 0.0e+00
|x - x'|/|x'| = 3.48e-07 ≰ 0.0e+00
|f(x) - f(x')| = 6.91e-14 ≰ 0.0e+00
|f(x) - f(x')|/|f(x')| = 9.03e+06 ≰ 0.0e+00
|g(x)| = 2.32e-09 ≤ 1.0e-08
* Work counters
Seconds run: 0 (vs limit Inf)
Iterations: 16
f(x) calls: 53
∇f(x) calls: 53
The same AD-enabled setup also handles box constraints, for example with Optim.jl's Fminbox wrapper:
prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, Fminbox(GradientDescent()))
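Since f was constructed with Optimization.AutoForwardDiff(), Hessians can be generated automatically as well, so second-order methods work without any additional code. A brief sketch using Optim.jl's Newton method on the unconstrained problem (assuming OptimizationOptimJL is still loaded):

prob = OptimizationProblem(f, x0, p)
sol = solve(prob, Newton())   # gradient and Hessian supplied via ForwardDiff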
The examples clearly demonstrate that Optimization.jl provides an intuitive way of specifying optimization tasks and offers relatively easy access to a wide range of optimization algorithms.