# OptimizationODE.jl
OptimizationODE.jl provides ODE-based optimization methods as a solver plugin for SciML's Optimization.jl. It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.
## Installation

```julia
using Pkg
Pkg.add("OptimizationODE")
```

## Usage
```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

function f(x, p)
    return sum(abs2, x)
end

function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

opt = ODEGradientDescent(dt = 0.01)
sol = solve(prob_manual, opt; maxiters = 50_000)

@show sol.u
@show sol.objective
```

## Local Gradient-based Optimizers
All provided optimizers are local, gradient-based methods that solve optimization problems by integrating a gradient-flow ODE to convergence:
  - `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.
  - `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.
  - `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.
  - `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.
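As a sketch of swapping one of these methods into the Usage example, the same quadratic objective can be solved with `RKAccelerated()` and an automatic-differentiation backend in place of a hand-written gradient (the objective and starting point here are just the ones from the Usage section):

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

f(x, p) = sum(abs2, x)

# AutoForwardDiff supplies the gradient, so no hand-written g! is needed
f_ad = OptimizationFunction(f, ADTypes.AutoForwardDiff())
prob = OptimizationProblem(f_ad, [2.0, -3.0])

# Tsit5-based integration typically reaches steady state in fewer steps
sol = solve(prob, RKAccelerated(); maxiters = 10_000)
```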
You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by OrdinaryDiffEq.jl.
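For example, a custom optimizer might wrap OrdinaryDiffEq.jl's `BS3` or `Euler` integrators; the solver choices below are purely illustrative:

```julia
using OptimizationODE, OrdinaryDiffEq

# Adaptive solver: no step size required
opt_bs3 = ODEOptimizer(BS3())

# Fixed-step solver: supply dt explicitly
opt_euler = ODEOptimizer(Euler(); dt = 0.01)
```

Either object can then be passed to `solve` exactly like the built-in optimizers above.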
## DAE-based Optimizers
In addition to ODE-based optimizers, OptimizationODE.jl provides optimizers for differential-algebraic equation (DAE) constrained problems:
  - `DAEMassMatrix()` — uses the Rodas5P solver (from OrdinaryDiffEq.jl) for DAE problems with a mass matrix formulation.
  - `DAEOptimizer(IDA())` — uses the IDA solver (from Sundials.jl) for DAE problems with index variable support (requires `using Sundials`).
You can also define a custom optimizer using the generic `ODEOptimizer(solver)` or `DAEOptimizer(solver)` constructor by supplying any ODE or DAE solver supported by OrdinaryDiffEq.jl or Sundials.jl.
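A sketch of constructing both DAE-based variants, assuming the constructors described above (the specific solvers are illustrative):

```julia
using OptimizationODE, OrdinaryDiffEq, Sundials

# Mass-matrix formulation via a Rosenbrock method from OrdinaryDiffEq.jl
opt_mm = DAEOptimizer(Rodas5P())

# Fully implicit formulation via Sundials' IDA
opt_ida = DAEOptimizer(IDA())
```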
## Interface Details
All optimizers require gradient information, supplied either via automatic differentiation (e.g. `ADTypes.AutoForwardDiff()`) or via a user-provided gradient function passed as the `grad` keyword of `OptimizationFunction`, as in the Usage example. The optimization is performed by integrating the ODE defined by the negative gradient until a steady state is reached.
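Concretely, the ODE being integrated is the gradient flow `dx/dt = -∇f(x)`. The following self-contained sketch of the explicit-Euler variant is a hand-rolled illustration of this idea, not the package's actual implementation (which delegates the integration to an ODE solver):

```julia
# Hand-rolled sketch of explicit-Euler gradient flow, dx/dt = -∇f(x).
# Conceptually mirrors ODEGradientDescent; the real package integrates
# this ODE with an OrdinaryDiffEq.jl solver rather than looping by hand.
function euler_gradient_flow(g!, x0; dt = 0.01, maxiters = 50_000, tol = 1e-8)
    x = copy(x0)
    g = similar(x)
    for _ in 1:maxiters
        g!(g, x, nothing)
        maximum(abs, g) < tol && break   # steady state: gradient ≈ 0
        @. x -= dt * g                   # one explicit Euler step
    end
    return x
end

grad!(g, x, p) = (@. g = 2 * x)              # gradient of sum(abs2, x)
xmin = euler_gradient_flow(grad!, [2.0, -3.0])  # approaches [0.0, 0.0]
```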