Releases · SciML/Optimization.jl
v4.0.5
Optimization v4.0.5
Merged pull requests:
- Update `_check_and_convert_maxiters` to ensure rounding (#849) (@Vaibhavdixit02)
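For context on the `maxiters` handling touched by #849: the iteration budget is passed as a keyword to `solve`, and non-integer values are converted and rounded internally. A minimal sketch, assuming the standard Optimization.jl solve interface (the Rosenbrock objective and the `1e3` budget are illustrative, not from the release notes):

```julia
using Optimization, ForwardDiff  # ForwardDiff backs the AD choice below

# Toy objective (illustrative only)
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# A non-integer budget such as 1e3 is rounded to an integer before use
sol = solve(prob, Optimization.LBFGS(); maxiters = 1e3)
```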
v4.0.4
Optimization v4.0.4
Merged pull requests:
- fix: match iterations count in Sophia (#841) (@avik-pal)
- Update links to algorithms in optim.md (#846) (@AbdAlazezAhmed)
- MOI vector lambda and iteration fixes in Optimisers (#847) (@Vaibhavdixit02)
v4.0.3
Optimization v4.0.3
Merged pull requests:
- feat: make MLUtils into a weakdep & support MLDataDevices (#827) (@avik-pal)
- fix: no printing "dkjht" (#828) (@avik-pal)
- Optimisers iteration count should be calculated with epochs and data both (#830) (@Vaibhavdixit02)
- NLopt: Reuse constraint evaluations (#832) (@Vaibhavdixit02)
- Update index.md (#836) (@Vaibhavdixit02)
- Update optimisers extensions and tests (#838) (@Vaibhavdixit02)
Closed issues:
- Multi objective optimization (#18)
- `PolyOpt` only accept functions without any extra inputs (#728)
- Disassociate batching from the `data` arg to `solve` and remove `data` (#776)
- Move Sophia from OptimizationOptimisers to `src/` (#783)
- Support for `MOI.eval_constraint_jacobian_transpose_product` (#808)
- docs: index.md: broken html after v3.26 (#834)
- OptimizationFunction cannot return multiple values (#839)
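Since #783 moves Sophia into the main package, it can be called like the other native solvers. A hedged sketch, assuming Sophia is reachable as `Optimization.Sophia` after the move (objective, starting point, and iteration count are illustrative):

```julia
using Optimization, ForwardDiff

# Toy objective (illustrative only)
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Sophia uses curvature information, so an AD backend is needed
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Assumes the post-#783 location of Sophia in the main namespace
sol = solve(prob, Optimization.Sophia(); maxiters = 1000)
```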
v4.0.2
Optimization v4.0.2
v4.0.1
Optimization v4.0.1
v4.0.0
Optimization v4.0.0
Merged pull requests:
- [WIP] Updates for OptimizationBase v2 (#789) (@Vaibhavdixit02)
- Add constraints support for NLopt (#799) (@Vaibhavdixit02)
- Add constraints support for NOMAD (#817) (@Vaibhavdixit02)
- Bump versions (#818) (@Vaibhavdixit02)
- docs updates (#819) (@Vaibhavdixit02)
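The NLopt constraints support from #799 goes through the generic `cons`/`lcons`/`ucons` interface of `OptimizationProblem`. A sketch under that assumption (the objective, the single norm-ball constraint, and the SLSQP choice are illustrative):

```julia
using Optimization, OptimizationNLopt, ForwardDiff

# Toy constrained problem (illustrative only)
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
# In-place constraint function: res holds g(x)
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2]; nothing)

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
# Enforce x[1]^2 + x[2]^2 <= 1 through the constraint bounds
prob = OptimizationProblem(optf, [0.5, 0.5], [1.0, 100.0]; lcons = [-Inf], ucons = [1.0])

# SLSQP is a gradient-based NLopt algorithm that accepts nonlinear constraints
sol = solve(prob, NLopt.LD_SLSQP())
```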
v3.28.0
Optimization v3.28.0
Merged pull requests:
- CompatHelper: bump compat for OptimizationPRIMA to 0.2 for package docs, (keep existing compat) (#786) (@github-actions[bot])
- Updating solver __solve function for MOO (#787) (@ParasPuneetSingh)
- New tutorials and docs (#788) (@Vaibhavdixit02)
- Add a conversion mechanism between NLPModel to OptimizationProblem (#792) (@alonsoC1s)
- CompatHelper: add new compat entry for SymbolicAnalysis at version 0.2 for package docs, (keep existing compat) (#794) (@github-actions[bot])
- CompatHelper: add new compat entry for Symbolics at version 6 for package docs, (keep existing compat) (#796) (@github-actions[bot])
- CompatHelper: bump compat for Symbolics to 6 for package OptimizationMOI, (keep existing compat) (#797) (@github-actions[bot])
- CompatHelper: add new compat entry for SymbolicAnalysis at version 0.3 for package docs, (keep existing compat) (#800) (@github-actions[bot])
- Add remake docs (#801) (@Vaibhavdixit02)
- CompatHelper: add new compat entry for NLPModelsTest at version 0.10 for package docs, (keep existing compat) (#803) (@github-actions[bot])
- CompatHelper: add new compat entry for NLPModels at version 0.21 for package docs, (keep existing compat) (#804) (@github-actions[bot])
- Update __loss function for MOO using OptimizationMetaheuristics.jl (#805) (@ParasPuneetSingh)
- refactor: remove dependency on Pkg (#807) (@SebastianM-C)
- Added precompilation for nonnegative least squares (#809) (@arismavridis)
- call cons_vjp if available (#811) (@baggepinnen)
Closed issues:
- Add an abstraction for chaining optimizers (#78)
- Maxiters not respected (#335)
- Segfault with NLopt (#344)
- Parallelize local optimizations with MultistartOptimization (#377)
- Add more polyalgorithms such as mixture of global (BBO/CMAES/...) and first order and second order optimizers (#523)
- Downstream Compat bumps (#669)
- Adding a thin wrapper to NLPModels (#790)
- maxtime documentation doesn't mention units (seconds) (#795)
- Optimization.LBFGS fails with MethodError: no method matching take(::Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, ::Nothing) (#798)
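The `remake` docs added in #801 cover the usual SciML pattern of rebuilding a problem with selected fields replaced. A brief sketch (the new initial guess and parameter values are illustrative):

```julia
using Optimization, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# remake returns a copy of the problem with the given fields swapped out
prob2 = remake(prob; u0 = [0.5, 0.5], p = [2.0, 50.0])
sol = solve(prob2, Optimization.LBFGS(); maxiters = 100)
```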
v3.27.0
Optimization v3.27.0
v3.26.3
Optimization v3.26.3
Merged pull requests:
- Add more retcode testing and fix space in lbfgsb return (#782) (@Vaibhavdixit02)
v3.26.2
Optimization v3.26.2
Merged pull requests:
- Add another lbfgsb stopping criteria for retcode handling (#781) (@Vaibhavdixit02)
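Both LBFGSB changes above concern return-code handling; downstream code usually branches on `sol.retcode` against the `ReturnCode` enum from SciMLBase. A short sketch (the success/failure handling is illustrative):

```julia
using Optimization, ForwardDiff
using SciMLBase: ReturnCode  # ReturnCode enum reported by solve

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Optimization.LBFGS(); maxiters = 100)

# The return code records how the solver terminated
if sol.retcode == ReturnCode.Success
    @info "converged" sol.u sol.objective
else
    @warn "stopped without success" sol.retcode
end
```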