Functions and constraints
Once an expression is created, it is possible to create the Terms defining the optimization problem. These can consist of smooth functions, nonsmooth functions, inequality constraints, or equality constraints.
Smooth functions
StructuredOptimization.ls — Function
ls(x::AbstractExpression)
Returns the squared norm (least squares) of x:
$f(\mathbf{x}) = \tfrac{1}{2} \| \mathbf{x} \|^2$
(shorthand of 1/2*norm(x)^2).
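For instance, a least squares data fidelity term over an affine expression can be constructed as follows (a minimal sketch; the sizes of A, b and the variable are illustrative):
julia> using StructuredOptimization
julia> A, b = randn(10, 5), randn(10);
julia> x = Variable(5);
julia> t = ls(A*x - b)   # term encoding 1/2*norm(A*x - b)^2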
StructuredOptimization.huberloss — Function
huberloss(x::AbstractExpression, ρ=1.0)
Applies the Huber loss function:
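A robust variant of the same residual can use the Huber penalty instead; the sketch below is illustrative (random data, threshold ρ = 1.0):
julia> A, b, x = randn(10, 5), randn(10), Variable(5);
julia> t = huberloss(A*x - b, 1.0)   # Huber penalty on the residual, with threshold ρ = 1.0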
StructuredOptimization.sqrhingeloss — Function
sqrhingeloss(x::AbstractExpression, y::Array)
Applies the squared Hinge loss function:
where y is an array containing $y_i$.
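An illustrative sketch of a squared hinge classification term, assuming ±1 labels y and arbitrary sizes:
julia> A, y, w = randn(100, 5), sign.(randn(100)), Variable(5);
julia> t = sqrhingeloss(A*w, y)   # squared hinge loss of the scores A*w against the labels y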
StructuredOptimization.crossentropy — Function
crossentropy(x::AbstractExpression, y::Array)
Applies the cross entropy loss function:
where y is an array of length $N$ containing $y_i$ having $0 \leq y_i \leq 1$.
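An illustrative sketch (the targets y are drawn uniformly in [0,1], sizes are arbitrary):
julia> y, p = rand(100), Variable(100);
julia> t = crossentropy(p, y)   # cross entropy between the expression p and the targets y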
StructuredOptimization.logisticloss — Function
logisticloss(x::AbstractExpression, y::AbstractArray)
Applies the logistic loss function:
where y is an array containing $y_i$.
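An illustrative sketch of a logistic regression term, assuming ±1 labels y and arbitrary sizes:
julia> A, y, w = randn(100, 5), sign.(randn(100)), Variable(5);
julia> t = logisticloss(A*w, y)   # logistic loss of the scores A*w against the labels y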
LinearAlgebra.dot — Function
dot(c::AbstractVector, x::AbstractExpression)
Applies the function:
$f(\mathbf{x}) = \mathbf{c}^{T} \mathbf{x}$
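For example (c and the variable size are illustrative; dot comes from LinearAlgebra):
julia> using LinearAlgebra
julia> c, x = randn(5), Variable(5);
julia> t = dot(c, x)   # linear term c'*x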
Nonsmooth functions
LinearAlgebra.norm — Function
norm(x::AbstractExpression, p=2, [q,] [dim=1])
Returns the norm of x.
Supported norms:
p = 0          $l_0$-pseudo-norm
p = 1          $l_1$-norm
p = 2          $l_2$-norm
p = Inf        $l_{\infty}$-norm
p = *          nuclear norm
p = 2, q = 1   $l_{2,1}$ mixed norm (aka Sum-of-$l_2$-norms)
where $\mathbf{x}_i$ is the $i$-th column of $\mathbf{X}$ if dim == 1 (or the $i$-th row if dim == 2).
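For example, an $l_1$ regularization term and a sum-of-$l_2$-norms term over the columns of a matrix variable might look as follows (sizes are illustrative; dim defaults to 1):
julia> x = Variable(10);
julia> t1 = norm(x, 1)      # l1-norm term
julia> X = Variable(5, 3);
julia> t2 = norm(X, 2, 1)   # l2,1 mixed norm, i.e. the sum of the l2-norms of the columns of X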
Base.maximum — Function
maximum(x::AbstractExpression)
Applies the function:
$f(\mathbf{x}) = \max \{ x_i : i = 1, \dots, n \}$
StructuredOptimization.sumpositive — Function
sumpositive(x::AbstractExpression, ρ=1.0)
Applies the function:
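Brief illustrative sketches of the two functions above (the comments state the commonly assumed definitions):
julia> x = Variable(5);
julia> t1 = maximum(x)       # largest entry of x
julia> t2 = sumpositive(x)   # sum of the positive entries of x (assumed definition)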
StructuredOptimization.hingeloss — Function
hingeloss(x::AbstractExpression, y::Array)
Applies the Hinge loss function:
where y is an array containing $y_i$.
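An illustrative sketch, analogous to the squared hinge example above (±1 labels, arbitrary sizes):
julia> A, y, w = randn(100, 5), sign.(randn(100)), Variable(5);
julia> t = hingeloss(A*w, y)   # hinge loss of the scores A*w against the labels y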
StructuredOptimization.logbarrier — Function
logbarrier(x::AbstractExpression)
Applies the log barrier function:
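An illustrative sketch (the variable size is arbitrary):
julia> x = Variable(5);
julia> t = logbarrier(x)   # log barrier term, e.g. to keep the entries of x strictly positive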
Inequality constraints
Base.:<= — Function
Inequality constraints
Norm inequality constraints
norm(x::AbstractExpression, 0) <= n::Integer     $\mathrm{nnz}(\mathbf{x}) \leq n$
norm(x::AbstractExpression, 1) <= r::Number      $\sum_i | x_i | \leq r$
norm(x::AbstractExpression, 2) <= r::Number      $\| \mathbf{x} \| \leq r$
norm(x::AbstractExpression, Inf) <= r::Number    $\max \{ |x_1|, |x_2|, \dots \} \leq r$
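For example (variable size and bounds are illustrative):
julia> x = Variable(10);
julia> c1 = norm(x, 0) <= 3      # at most 3 nonzero entries
julia> c2 = norm(x, 2) <= 1.0    # x constrained to the unit l2 ball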
Box inequality constraints
x::AbstractExpression <= u::Union{AbstractArray, Real}     $x_i \leq u_i$
x::AbstractExpression >= l::Union{AbstractArray, Real}     $x_i \geq l_i$
Notice that the notation x in [l,u] is also possible.
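For example (bounds are illustrative; the bracket notation follows the remark above):
julia> x = Variable(5);
julia> c1 = x >= 0.0             # nonnegativity constraint
julia> c2 = x in [-1.0, 1.0]     # box constraint -1 <= x_i <= 1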
Rank inequality constraints
rank(X::AbstractExpression) <= n::Integer     $\mathrm{rank}(\mathbf{X}) \leq n$
Notice that the expression X must have a codomain with dimension equal to 2.
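For example (matrix size and rank bound are illustrative; rank comes from LinearAlgebra):
julia> using LinearAlgebra
julia> X = Variable(10, 10);
julia> c = rank(X) <= 2   # X constrained to have rank at most 2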
Equality constraints
Base.:== — Function
Equality constraints
Affine space constraint
ex == b::Union{Real, AbstractArray}
Requires the expression ex to be affine.
Example
julia> A, b = randn(10,5), randn(10);
julia> x = Variable(5);
julia> A*x == b
Norm equality constraint
norm(x::AbstractExpression) == r::Number     $\| \mathbf{x} \| = r$
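For example (the radius is illustrative):
julia> x = Variable(5);
julia> c = norm(x) == 1.0   # x constrained to the unit sphere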
Binary constraint
x::AbstractExpression == (l, u)     $x_i = l_i$ or $x_i = u_i$
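For example, a 0/1 constraint on each entry might be written as follows (an illustrative sketch; the tuple holds the two admissible values):
julia> x = Variable(5);
julia> c = x == (0.0, 1.0)   # each entry of x constrained to be either 0 or 1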
Smoothing
Sometimes the optimization problem might involve nonsmooth terms which do not have efficiently computable proximal mappings. It is possible to smooth these terms by means of the Moreau envelope.
StructuredOptimization.smooth — Function
smooth(t::Term, gamma = 1.0)
Smooths the nonsmooth term t using the Moreau envelope:
$f^{\gamma}(\mathbf{x}) = \min_{\mathbf{z}} \left\{ f(\mathbf{z}) + \tfrac{1}{2 \gamma} \| \mathbf{z} - \mathbf{x} \|^2 \right\}$
Example
julia> x = Variable(4)
Variable(Float64, (4,))
julia> t = smooth(norm(x,1))
Duality
In some cases it is more convenient to solve the dual problem instead of the primal problem. It is possible to convert a problem into its dual by means of the convex conjugate.
See the Total Variation demo for an example of such a procedure.
Base.conj — Function
conj(t::Term)
Returns the convex conjugate transform of t:
$f^{*}(\mathbf{x}) = \sup_{\mathbf{y}} \{ \langle \mathbf{y}, \mathbf{x} \rangle - f(\mathbf{y}) \}$
Example
julia> x = Variable(4)
Variable(Float64, (4,))
julia> t = conj(norm(x,1))