Using Equality and Inequality Constraints

Several of the optimization solvers available through MathOptInterface, as well as Optim's IPNewton solver, can handle nonlinear constraints. Optimization.jl provides a simple interface: define the constraints as a Julia function passed to OptimizationFunction, and then specify lower and upper bounds on its output in OptimizationProblem to indicate whether each constraint is an equality or an inequality.
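
In general, if the constraint function writes a vector g(x) into its output argument, the bounds enforce, componentwise,

\[ \mathrm{lcons}_i \leq g_i(x) \leq \mathrm{ucons}_i \]

so an equality constraint corresponds to setting the two bounds equal, and a one-sided inequality is obtained by setting the unused side to -Inf or Inf. The examples below use exactly this pattern.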

Let's define the Rosenbrock function as our objective function and consider the inequalities below as our constraints.

\[\begin{aligned} x_1^2 + x_2^2 &\leq 0.8 \\ -1.0 \leq x_1 \cdot x_2 &\leq 2.0 \end{aligned}\]

using Optimization, OptimizationMOI, OptimizationOptimJL, Ipopt
using ForwardDiff, ModelingToolkit

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 1.0]
2-element Vector{Float64}:
 1.0
 1.0

Next, we define our constraint function, which writes the sum of squares and the product of the optimization variables into its output vector in place.

cons(res, x, p) = (res .= [x[1]^2 + x[2]^2, x[1] * x[2]])
cons (generic function with 1 method)
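
As a quick sanity check (not part of the solve itself), the in-place convention can be exercised directly; the values below follow from the definition of cons.

res = zeros(2)
cons(res, [1.0, 2.0], _p)  # writes [1.0^2 + 2.0^2, 1.0 * 2.0] into res
res                        # [5.0, 2.0]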

We'll use the IPNewton solver from Optim to solve the problem.

optprob = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), cons = cons)
prob = OptimizationProblem(optprob, x0, _p, lcons = [-Inf, -1.0], ucons = [0.8, 2.0])
sol = solve(prob, IPNewton())
retcode: Success
u: 2-element Vector{Float64}:
 0.7519644536194577
 0.484303066780286
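
If you'd rather check convergence programmatically than read the printout, the return code can be inspected; this is a small sketch assuming the successful_retcode helper from SciMLBase is available in the environment.

using SciMLBase: successful_retcode
successful_retcode(sol)  # true when the solver reports a successful return code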

Let's check that the constraints are satisfied and that the objective is lower than at the initial values.

res = zeros(2)
cons(res, sol.u, _p)
res
2-element Vector{Float64}:
 0.7999999999999997
 0.36417869099766553
prob.f(sol.u, _p)
0.06810654459826093
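
For the comparison with the starting point, the objective can be evaluated at x0 the same way; since x0 = zeros(2) and _p = [1.0, 1.0], it equals (1 - 0)^2 + 1 * (0 - 0)^2 = 1.0, well above the value at the solution.

prob.f(x0, _p)  # 1.0 at the initial point, versus ≈ 0.068 at the solution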

We can also use the Ipopt library with the OptimizationMOI package.

sol = solve(prob, Ipopt.Optimizer())
retcode: Success
u: 2-element Vector{Float64}:
 0.7519644519971305
 0.4843030638909282
res = zeros(2)
cons(res, sol.u, _p)
res
2-element Vector{Float64}:
 0.7999999947614852
 0.3641786880392731
prob.f(sol.u, _p)
0.06810654547600103

We can also use ModelingToolkit as our AD backend and generate symbolic derivatives and expression graphs for the objective and the constraints.

Let's modify the bounds to turn the constraints into equality constraints. The constraints now become:

\[\begin{aligned} x_1^2 + x_2^2 &= 1.0 \\ x_1 \cdot x_2 &= 0.5 \end{aligned}\]

optprob = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit(), cons = cons)
prob = OptimizationProblem(optprob, x0, _p, lcons = [1.0, 0.5], ucons = [1.0, 0.5])
OptimizationProblem. In-place: true
u0: 2-element Vector{Float64}:
 0.0
 0.0

Below, the AmplNLWriter.jl package is used together with the Ipopt library to solve the problem.

using AmplNLWriter, Ipopt_jll
sol = solve(prob, AmplNLWriter.Optimizer(Ipopt_jll.amplexe))
retcode: Success
u: 2-element Vector{Float64}:
 0.7071678163428006
 0.7070457460302945

The constraints evaluate to 1.0 and 0.5 respectively, as expected.

res = zeros(2)
cons(res, sol.u, _p)
println(res)
[1.0000000074505806, 0.4999999962747097]
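
The returned point also matches the analytic solution of the equality constraints. Adding and subtracting twice the second equation to and from the first gives

\[\begin{aligned} (x_1 + x_2)^2 &= 1.0 + 2 \cdot 0.5 = 2.0 \\ (x_1 - x_2)^2 &= 1.0 - 2 \cdot 0.5 = 0.0 \end{aligned}\]

so, on the positive branch found by the solver, \(x_1 = x_2 = 1/\sqrt{2} \approx 0.7071\), which agrees with the reported solution up to solver tolerance.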