
Merge pull request #838 from SciML/optbasev2.2
Update optimisers extensions and tests
Vaibhavdixit02 authored Oct 4, 2024
2 parents 4127df2 + 51eaf46 commit 8a58c5f
Showing 14 changed files with 78 additions and 91 deletions.
10 changes: 4 additions & 6 deletions NEWS.md
Original file line number Diff line number Diff line change
@@ -1,9 +1,7 @@
# v4 Breaking changes

-1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
-the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimziationProblem as the second argument to the objective etc (p) if you
-want to do minibatching, else for full batch just pass in the full data.
+1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
+the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimizationProblem as the second argument to the objective etc (p) if you
+want to do minibatching, else for full batch just pass in the full data.
 
-2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
-
-
+2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
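The two conventions described above can be sketched with plain Julia. Here `Iterators.partition` stands in for an MLUtils.jl `DataLoader` so the sketch is self-contained, and the `OptimizationProblem`/`solve` calls are shown only as comments since they require Optimization.jl; `optf`, `u0`, and `train_data` are hypothetical names.

```julia
# Sketch of the v4 convention: the data iterator is passed as the
# parameter `p`, the objective returns a scalar loss, and the callback
# takes only (state, loss).
data = collect(Iterators.partition(1:8, 2))  # stand-in for MLUtils.DataLoader

# v4 objective: (u, batch) -> scalar loss, no extra return values.
objective(u, batch) = sum(abs2, u[1] .- batch)

# Full-data pass over the stand-in loader, accumulating the scalar losses.
total = sum(objective([0.0], b) for b in data)

# In real use (requires Optimization.jl, OptimizationOptimisers.jl, MLUtils.jl):
#   prob = OptimizationProblem(optf, u0, DataLoader(train_data; batchsize = 2))
#   sol = solve(prob, Optimisers.Adam())  # stochastic solver iterates the DataLoader
```

For a full-batch run you would instead pass the whole dataset as `p`, exactly as the release note says.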
8 changes: 1 addition & 7 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Optimization"
uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
-version = "4.0.2"
+version = "4.0.3"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -11,7 +11,6 @@ LBFGSB = "5be7bae1-8223-5378-bac3-9e7378a2f6e6"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
LoggingExtras = "e6f89c97-d47a-5376-807f-9c37f3926c36"
-MLUtils = "f1d291b0-491e-4a28-83b9-f70985020b54"
OptimizationBase = "bca83a33-5cc9-4baa-983d-23429ab6bcbb"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
ProgressLogging = "33c8b6b6-d38a-422a-b730-caa89a2f386c"
@@ -29,16 +28,11 @@ LBFGSB = "0.4.1"
LinearAlgebra = "1.10"
Logging = "1.10"
LoggingExtras = "0.4, 1"
-MLUtils = "0.4.4"
OptimizationBase = "2"
Printf = "1.10"
ProgressLogging = "0.1"
Reexport = "1.2"
SciMLBase = "2.39.0"
SparseArrays = "1.10"
Symbolics = "5.12"
TerminalLoggers = "0.1"
julia = "1.9"

[extras]
Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
47 changes: 30 additions & 17 deletions docs/src/index.md
@@ -54,110 +54,110 @@ to add the specific wrapper packages.
```@raw html
<details>
<summary><strong>BlackBoxOptim</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
</details>
<details>
<summary><strong>CMAEvolutionaryStrategy</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
</details>
<details>
<summary><strong>Evolutionary</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
- Non-linear Constraints
</details>
<details>
<summary><strong>GCMAES</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- First order
- Box Constraints
- Unconstrained
</details>
<details>
<summary><strong>Manopt</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
- First order
- Second order
- Zeroth order
- Box Constraints
- Constrained 🟡
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
</details>
<details>
<summary><strong>MathOptInterface</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
- First order
- Second order
- Box Constraints
- Constrained
-- **Global Methods**
+- <strong>Global Methods</strong>
- First order
- Second order
- Constrained
</details>
<details>
<summary><strong>MultistartOptimization</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- First order
- Second order
- Box Constraints
</details>
<details>
<summary><strong>Metaheuristics</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
</details>
<details>
<summary><strong>NOMAD</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
- Constrained 🟡
</details>
<details>
<summary><strong>NLopt</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
- First order
- Zeroth order
- Second order 🟡
- Box Constraints
- Local Constrained 🟡
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- First order
- Unconstrained
- Constrained 🟡
</details>
<details>
<summary><strong>Optim</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
- Zeroth order
- First order
- Second order
- Box Constraints
- Constrained
-- **Global Methods**
+- <strong>Global Methods</strong>
- Zeroth order
- Unconstrained
- Box Constraints
</details>
<details>
<summary><strong>PRIMA</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
- Derivative-Free: ✅
- **Constraints**
- Box Constraints: ✅
@@ -167,13 +167,15 @@ to add the specific wrapper packages.
<summary><strong>QuadDIRECT</strong></summary>
- **Constraints**
- Box Constraints: ✅
-- **Global Methods**
+- <strong>Global Methods</strong>
- Unconstrained: ✅
</details>
```

🟡 = supported in the downstream library but not yet implemented in `Optimization.jl`; PRs to add this functionality are welcome

## Citation

```
@software{vaibhav_kumar_dixit_2023_7738525,
author = {Vaibhav Kumar Dixit and Christopher Rackauckas},
@@ -185,37 +187,48 @@ to add the specific wrapper packages.
url = {https://doi.org/10.5281/zenodo.7738525},
year = 2023}
```

## Reproducibility

```@raw html
<details><summary>The documentation of this SciML package was built using these direct dependencies,</summary>
```

```@example
using Pkg # hide
Pkg.status() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>and using this machine and Julia version.</summary>
```

```@example
using InteractiveUtils # hide
versioninfo() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>A more complete overview of all dependencies and their versions is also provided.</summary>
```

```@example
using Pkg # hide
Pkg.status(; mode = PKGMODE_MANIFEST) # hide
```

```@raw html
</details>
```

```@eval
using TOML
using Markdown
2 changes: 1 addition & 1 deletion docs/src/tutorials/certification.md
@@ -7,7 +7,7 @@ This works with the `structural_analysis` keyword argument to `OptimizationProbl
We'll use a simple example to illustrate the convexity structure certification process.

```@example symanalysis
-using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization, OptimizationMOI
+using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization
function f(x, p = nothing)
return exp(x[1]) + x[1]^2
2 changes: 1 addition & 1 deletion docs/src/tutorials/minibatch.md
@@ -54,7 +54,7 @@ end
function loss_adjoint(fullp, data)
batch, time_batch = data
pred = predict_adjoint(fullp, time_batch)
-    sum(abs2, batch .- pred), pred
+    sum(abs2, batch .- pred)
end
k = 10
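The tutorial change above drops the second return value (`pred`), matching the v4 rule that objectives return only a scalar. A self-contained sketch, with a hypothetical linear `predict` standing in for the tutorial's `predict_adjoint`:

```julia
# v4 objective: return only a scalar loss; `(loss, pred)` tuples are no
# longer supported. `predict` here is a hypothetical stand-in model.
predict(p, t) = p[1] .* t

function loss_adjoint(fullp, data)
    batch, time_batch = data
    pred = predict(fullp, time_batch)
    return sum(abs2, batch .- pred)  # scalar only
end

loss = loss_adjoint([2.0], ([2.0, 4.0], [1.0, 2.0]))  # perfect fit -> 0.0
```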
2 changes: 1 addition & 1 deletion lib/OptimizationOptimJL/Project.toml
@@ -1,7 +1,7 @@
name = "OptimizationOptimJL"
uuid = "36348300-93cb-4f02-beb5-3c3902f8871e"
authors = ["Vaibhav Dixit <[email protected]> and contributors"]
-version = "0.4.0"
+version = "0.4.1"

[deps]
Optim = "429524aa-4258-5aef-a3af-852621145aeb"
50 changes: 31 additions & 19 deletions lib/OptimizationOptimJL/src/OptimizationOptimJL.jl
@@ -26,6 +26,7 @@ function SciMLBase.requireshessian(opt::Union{
true
end
SciMLBase.requiresgradient(opt::Optim.Fminbox) = true
+# SciMLBase.allowsfg(opt::Union{Optim.AbstractOptimizer, Optim.ConstrainedOptimizer, Optim.Fminbox, Optim.SAMIN}) = true

function __map_optimizer_args(cache::OptimizationCache,
opt::Union{Optim.AbstractOptimizer, Optim.Fminbox,
@@ -142,11 +143,11 @@ function SciMLBase.__solve(cache::OptimizationCache{
θ = metadata[cache.opt isa Optim.NelderMead ? "centroid" : "x"]
opt_state = Optimization.OptimizationState(iter = trace.iteration,
u = θ,
-        objective = x[1],
+        objective = trace.value,
grad = get(metadata, "g(x)", nothing),
hess = get(metadata, "h(x)", nothing),
original = trace)
-    cb_call = cache.callback(opt_state, x...)
+    cb_call = cache.callback(opt_state, trace.value)
if !(cb_call isa Bool)
error("The callback should return a boolean `halt` for whether to stop the optimization process.")
end
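The check above enforces that user callbacks return a boolean halt flag. A minimal conforming callback under the v4 `(state, loss)` signature (the early-stopping threshold is illustrative):

```julia
# v4 callback convention: (state, loss) -> Bool, where `true` halts the solve.
halting_callback(state, loss) = loss < 1e-8
```

Returning anything other than a `Bool` triggers exactly the error shown above.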
@@ -261,11 +262,11 @@ function SciMLBase.__solve(cache::OptimizationCache{
metadata["x"]
opt_state = Optimization.OptimizationState(iter = trace.iteration,
u = θ,
-        objective = x[1],
+        objective = trace.value,
grad = get(metadata, "g(x)", nothing),
hess = get(metadata, "h(x)", nothing),
original = trace)
-    cb_call = cache.callback(opt_state, x...)
+    cb_call = cache.callback(opt_state, trace.value)
if !(cb_call isa Bool)
error("The callback should return a boolean `halt` for whether to stop the optimization process.")
end
@@ -277,14 +278,19 @@ function SciMLBase.__solve(cache::OptimizationCache{
__x = first(x)
return cache.sense === Optimization.MaxSense ? -__x : __x
end
-    fg! = function (G, θ)
-        if G !== nothing
-            cache.f.grad(G, θ)
-            if cache.sense === Optimization.MaxSense
-                G .*= -one(eltype(G))
-            end
-        end
-        return _loss(θ)
-    end
+
+    if cache.f.fg === nothing
+        fg! = function (G, θ)
+            if G !== nothing
+                cache.f.grad(G, θ)
+                if cache.sense === Optimization.MaxSense
+                    G .*= -one(eltype(G))
+                end
+            end
+            return _loss(θ)
+        end
+    else
+        fg! = cache.f.fg
+    end
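The branch above reuses a user-supplied fused `fg!` when `cache.f.fg` exists and otherwise builds one from `grad`. The fused convention itself (Optim.jl style: fill `G` in place when it is not `nothing`, return the objective value) can be sketched in isolation; `make_fg!` is a hypothetical helper, not part of the package:

```julia
# Optim.jl-style fused objective/gradient: fg!(G, θ) writes the gradient
# into G in place (when G !== nothing) and returns the loss, so a single
# call serves both queries.
function make_fg!(loss, grad!; maximize::Bool = false)
    return function (G, θ)
        if G !== nothing
            grad!(G, θ)
            maximize && (G .*= -one(eltype(G)))  # flip gradient for MaxSense
        end
        return maximize ? -loss(θ) : loss(θ)
    end
end

loss(θ) = sum(abs2, θ)
grad!(G, θ) = (G .= 2 .* θ)
fg! = make_fg!(loss, grad!)

G = zeros(2)
val = fg!(G, [1.0, 2.0])   # fills G and returns the loss
```

Passing `G = nothing` skips the gradient work, which is why the callee checks `G !== nothing`.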

gg = function (G, θ)
@@ -344,9 +350,9 @@ function SciMLBase.__solve(cache::OptimizationCache{
u = metadata["x"],
grad = get(metadata, "g(x)", nothing),
hess = get(metadata, "h(x)", nothing),
-        objective = x[1],
+        objective = trace.value,
original = trace)
-    cb_call = cache.callback(opt_state, x...)
+    cb_call = cache.callback(opt_state, trace.value)
if !(cb_call isa Bool)
error("The callback should return a boolean `halt` for whether to stop the optimization process.")
end
@@ -358,15 +364,21 @@ function SciMLBase.__solve(cache::OptimizationCache{
__x = first(x)
return cache.sense === Optimization.MaxSense ? -__x : __x
end
-    fg! = function (G, θ)
-        if G !== nothing
-            cache.f.grad(G, θ)
-            if cache.sense === Optimization.MaxSense
-                G .*= -one(eltype(G))
-            end
-        end
-        return _loss(θ)
-    end
+
+    if cache.f.fg === nothing
+        fg! = function (G, θ)
+            if G !== nothing
+                cache.f.grad(G, θ)
+                if cache.sense === Optimization.MaxSense
+                    G .*= -one(eltype(G))
+                end
+            end
+            return _loss(θ)
+        end
+    else
+        fg! = cache.f.fg
+    end

gg = function (G, θ)
cache.f.grad(G, θ)
if cache.sense === Optimization.MaxSense
@@ -434,7 +446,7 @@ PrecompileTools.@compile_workload begin
function obj_f(x, p)
A = p[1]
b = p[2]
-        return sum((A * x - b) .^ 2)
+        return sum((A * x .- b) .^ 2)
end

function solve_nonnegative_least_squares(A, b, solver)
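The precompile workload's objective is a plain least-squares residual; the fixed line broadcasts the subtraction (`.-`) so it stays shape-safe. A self-contained check of that objective (the `A`/`b` values are illustrative):

```julia
# ‖A*x - b‖² objective as used in the precompile workload, with p = (A, b).
function obj_f(x, p)
    A = p[1]
    b = p[2]
    return sum((A * x .- b) .^ 2)
end

A = [2.0 0.0; 0.0 3.0]
b = [2.0, 3.0]
residual_at_solution = obj_f([1.0, 1.0], (A, b))  # exact solution -> 0.0
```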