Engee Documentation

API

This page is in the process of being translated.

JuMP

An algebraic modeling language for Julia.

For more information, go to https://jump.dev.

ALMOST_DUAL_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ALMOST_DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem within relaxed tolerances.

ALMOST_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ALMOST_INFEASIBLE: The algorithm concluded that no feasible solution exists within relaxed tolerances.

ALMOST_LOCALLY_SOLVED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ALMOST_LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, or could not find directions for improvement within relaxed tolerances.

ALMOST_OPTIMAL::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ALMOST_OPTIMAL: The algorithm found a globally optimal solution to relaxed tolerances.

AUTOMATIC::ModelMode

An instance of the ModelMode enum.

AUTOMATIC: moi_backend field holds a CachingOptimizer in AUTOMATIC mode.

DIRECT::ModelMode

An instance of the ModelMode enum.

DIRECT: moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored. The moi_backend must support add_constraint etc.

DUAL_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem. If, additionally, a feasible (primal) solution is known to exist, this status typically implies that the problem is unbounded, with some technical exceptions.

FEASIBILITY_SENSE::OptimizationSense

An instance of the OptimizationSense enum.

FEASIBILITY_SENSE: the model does not have an objective function
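
The sense of a model can be queried with objective_sense; a minimal sketch (the exact printed form may vary by JuMP version):

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2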

FEASIBLE_POINT::ResultStatusCode

An instance of the ResultStatusCode enum.

FEASIBLE_POINT: the result vector is a feasible point.

INFEASIBILITY_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

INFEASIBILITY_CERTIFICATE: the result vector is an infeasibility certificate. If the PrimalStatus is INFEASIBILITY_CERTIFICATE, then the primal result vector is a certificate of dual infeasibility. If the DualStatus is INFEASIBILITY_CERTIFICATE, then the dual result vector is a proof of primal infeasibility.

INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INFEASIBLE: The algorithm concluded that no feasible solution exists.

INFEASIBLE_OR_UNBOUNDED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INFEASIBLE_OR_UNBOUNDED: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve.

INFEASIBLE_POINT::ResultStatusCode

An instance of the ResultStatusCode enum.

INFEASIBLE_POINT: the result vector is an infeasible point.

INTERRUPTED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INTERRUPTED: The algorithm stopped because of an interrupt signal.

INVALID_MODEL::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INVALID_MODEL: The algorithm stopped because the model is invalid.

INVALID_OPTION::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INVALID_OPTION: The algorithm stopped because it was provided an invalid option.

ITERATION_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ITERATION_LIMIT: An iterative algorithm stopped after conducting the maximum number of iterations.

LOCALLY_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

LOCALLY_INFEASIBLE: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, without guarantees that no feasible solution exists.

LOCALLY_SOLVED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees.

MANUAL::ModelMode

An instance of the ModelMode enum.

MANUAL: moi_backend field holds a CachingOptimizer in MANUAL mode.

MAX_SENSE::OptimizationSense

An instance of the OptimizationSense enum.

MAX_SENSE: the goal is to maximize the objective function

MEMORY_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

MEMORY_LIMIT: The algorithm stopped because it ran out of memory.

MIN_SENSE::OptimizationSense

An instance of the OptimizationSense enum.

MIN_SENSE: the goal is to minimize the objective function

NEARLY_FEASIBLE_POINT::ResultStatusCode

An instance of the ResultStatusCode enum.

NEARLY_FEASIBLE_POINT: the result vector is feasible if some constraint tolerances are relaxed.

NEARLY_INFEASIBILITY_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

NEARLY_INFEASIBILITY_CERTIFICATE: the result satisfies a relaxed criterion for a certificate of infeasibility.

NEARLY_REDUCTION_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

NEARLY_REDUCTION_CERTIFICATE: the result satisfies a relaxed criterion for an ill-posed certificate.

NODE_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

NODE_LIMIT: A branch-and-bound algorithm stopped because it explored a maximum number of nodes in the branch-and-bound tree.

NORM_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

NORM_LIMIT: The algorithm stopped because the norm of an iterate became too large.

NO_SOLUTION::ResultStatusCode

An instance of the ResultStatusCode enum.

NO_SOLUTION: the result vector is empty.

NUMERICAL_ERROR::TerminationStatusCode

An instance of the TerminationStatusCode enum.

NUMERICAL_ERROR: The algorithm stopped because it encountered unrecoverable numerical error.

OBJECTIVE_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OBJECTIVE_LIMIT: The algorithm stopped because it found a solution better than a minimum limit set by the user.

OPTIMAL::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OPTIMAL: The algorithm found a globally optimal solution.

OPTIMIZE_NOT_CALLED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OPTIMIZE_NOT_CALLED: The algorithm has not started.
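
The status can be queried with termination_status; a minimal sketch (exact output may vary by JuMP version):

julia> model = Model();

julia> termination_status(model)
OPTIMIZE_NOT_CALLED::TerminationStatusCode = 0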

OTHER_ERROR::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OTHER_ERROR: The algorithm stopped because of an error not covered by one of the statuses defined above.

OTHER_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OTHER_LIMIT: The algorithm stopped due to a limit not covered by one of the LIMIT statuses above.

OTHER_RESULT_STATUS::ResultStatusCode

An instance of the ResultStatusCode enum.

OTHER_RESULT_STATUS: the result vector contains a solution with an interpretation not covered by one of the statuses defined above

REDUCTION_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

REDUCTION_CERTIFICATE: the result vector is an ill-posed certificate; see this article for details. If the PrimalStatus is REDUCTION_CERTIFICATE, then the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus is REDUCTION_CERTIFICATE, then the dual result vector is a proof that the primal is ill-posed.

SLOW_PROGRESS::TerminationStatusCode

An instance of the TerminationStatusCode enum.

SLOW_PROGRESS: The algorithm stopped because it was unable to continue making progress towards the solution.

SOLUTION_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

SOLUTION_LIMIT: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters.

TIME_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

TIME_LIMIT: The algorithm stopped after a user-specified computation time.

UNKNOWN_RESULT_STATUS::ResultStatusCode

An instance of the ResultStatusCode enum.

UNKNOWN_RESULT_STATUS: the result vector contains a solution with an unknown interpretation.

const _CONSTRAINT_LIMIT_FOR_PRINTING = Ref{Int}(100)

A global constant used to control when constraints are omitted when printing the model.

Get and set this value using _CONSTRAINT_LIMIT_FOR_PRINTING[].

julia> _CONSTRAINT_LIMIT_FOR_PRINTING[]
100

julia> _CONSTRAINT_LIMIT_FOR_PRINTING[] = 10
10
const _TERM_LIMIT_FOR_PRINTING = Ref{Int}(60)

A global constant used to control when terms are omitted when printing expressions.

Get and set this value using _TERM_LIMIT_FOR_PRINTING[].

julia> _TERM_LIMIT_FOR_PRINTING[]
60

julia> _TERM_LIMIT_FOR_PRINTING[] = 10
10
op_and(x, y)

A function that falls back to x & y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_and(true, false)
false

julia> op_and(true, x)
true && x
op_equal_to(x, y)

A function that falls back to x == y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_equal_to(2, 2)
true

julia> op_equal_to(x, 2)
x == 2
op_greater_than_or_equal_to(x, y)

A function that falls back to x >= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_greater_than_or_equal_to(2, 2)
true

julia> op_greater_than_or_equal_to(x, 2)
x >= 2
op_less_than_or_equal_to(x, y)

A function that falls back to x <= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_less_than_or_equal_to(2, 2)
true

julia> op_less_than_or_equal_to(x, 2)
x <= 2
op_or(x, y)

A function that falls back to x | y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_or(true, false)
true

julia> op_or(true, x)
true || x
op_strictly_greater_than(x, y)

A function that falls back to x > y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_strictly_greater_than(1, 2)
false

julia> op_strictly_greater_than(x, 2)
x > 2
op_strictly_less_than(x, y)

A function that falls back to x < y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_strictly_less_than(1, 2)
true

julia> op_strictly_less_than(x, 2)
x < 2
abstract type AbstractConstraint

An abstract base type for all constraint types. AbstractConstraints store the function and set directly, unlike ConstraintRefs that are merely references to constraints stored in a model. AbstractConstraints do not need to be attached to a model.

AbstractJuMPScalar <: MutableArithmetics.AbstractMutable

Abstract base type for all scalar types

The subtyping of AbstractMutable will allow calls of some Base functions to be redirected to a method in MA that handles type promotion more carefully (for example the promotion in sparse matrix products in SparseArrays usually does not work for JuMP types) and exploits the mutability of AffExpr and QuadExpr.

AbstractModel

An abstract type that should be subtyped for users creating JuMP extensions.

AbstractScalarSet

An abstract type for defining new scalar sets in JuMP.

Implement moi_set(::AbstractScalarSet) to convert the type into an MOI set.

See also: moi_set.

AbstractShape

Abstract vectorizable shape. Given a flat vector form of an object of shape shape, the original object can be obtained by reshape_vector.

AbstractVariable

Variable returned by build_variable. It represents a variable that has not been added yet to any model. It can be added to a given model with add_variable.

AbstractVariableRef

Variable returned by add_variable. Affine (resp. quadratic) operations with variables of type V<:AbstractVariableRef and coefficients of type T create a GenericAffExpr{T,V} (resp. GenericQuadExpr{T,V}).

AbstractVectorSet

An abstract type for defining new sets in JuMP.

Implement moi_set(::AbstractVectorSet, dim::Int) to convert the type into an MOI set.

See also: moi_set.

AffExpr

Alias for GenericAffExpr{Float64,VariableRef}, the specific GenericAffExpr used by JuMP.

ArrayShape{N}(dims::NTuple{N,Int}) where {N}

An AbstractShape that represents array-valued constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2, 1:3]);

julia> c = @constraint(model, x >= 0, Nonnegatives())
[x[1,1]  x[1,2]  x[1,3]
 x[2,1]  x[2,2]  x[2,3]] ∈ Nonnegatives()

julia> shape(constraint_object(c))
ArrayShape{2}((2, 3))
BridgeableConstraint(
    constraint::C,
    bridge_type::B;
    coefficient_type::Type{T} = Float64,
) where {C<:AbstractConstraint,B<:Type{<:MOI.Bridges.AbstractBridge},T}

An AbstractConstraint representing a constraint that can be bridged by the bridge of type bridge_type{coefficient_type}.

Adding a BridgeableConstraint to a model is equivalent to:

add_bridge(model, bridge_type; coefficient_type = coefficient_type)
add_constraint(model, constraint)

Example

Given a new scalar set type CustomSet with a bridge CustomBridge that can bridge F-in-CustomSet constraints, when the user does:

model = Model()
@variable(model, x)
@constraint(model, x + 1 in CustomSet())
optimize!(model)

with an optimizer that does not support F-in-CustomSet constraints, the constraint will not be bridged unless the user first calls add_bridge(model, CustomBridge).

In order to automatically add the CustomBridge to any model to which an F-in-CustomSet is added, add the following method:

function JuMP.build_constraint(
    error_fn::Function,
    func::AbstractJuMPScalar,
    set::CustomSet,
)
    constraint = ScalarConstraint(func, set)
    return BridgeableConstraint(constraint, CustomBridge)
end

Note

JuMP extensions should extend JuMP.build_constraint only if they also define CustomSet, for three reasons:

  1. It is problematic if multiple extensions overload the same JuMP method.

  2. A missing method will not inform the users that they forgot to load the extension module defining the build_constraint method.

  3. Defining a method where neither the function nor any of the argument types are defined in the package is called type piracy and is discouraged in the Julia style guide.

ComplexPlane

Complex plane object that can be used to create a complex variable in the @variable macro.

Example

Consider the following example:

julia> model = Model();

julia> @variable(model, x in ComplexPlane())
real(x) + imag(x) im

julia> all_variables(model)
2-element Vector{VariableRef}:
 real(x)
 imag(x)

We see in the output of the last command that two real variables were created. The Julia variable x binds to an affine expression in terms of these two variables that parametrize the complex plane.

ComplexVariable{S,T,U,V} <: AbstractVariable

A struct used when adding complex variables.

See also: ComplexPlane.

struct ConstraintNotOwned{C<:ConstraintRef} <: Exception
    constraint_ref::C
end

An error thrown when the constraint constraint_ref was used in a model different to owner_model(constraint_ref).

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, x >= 0)
c : x ≥ 0

julia> model_new = Model();

julia> MOI.get(model_new, MOI.ConstraintName(), c)
ERROR: ConstraintNotOwned{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}(c : x ≥ 0)
Stacktrace:
[...]
ConstraintRef

Holds a reference to the model and the corresponding MOI.ConstraintIndex.
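
As a minimal sketch (exact output may vary by JuMP version), the owning model and the underlying MOI.ConstraintIndex can be retrieved with owner_model and index:

julia> model = Model();

julia> @variable(model, x);

julia> c = @constraint(model, 2x <= 1);

julia> owner_model(c) === model
true

julia> index(c)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}(1)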

mutable struct GenericAffExpr{CoefType,VarType} <: AbstractJuMPScalar
    constant::CoefType
    terms::OrderedDict{VarType,CoefType}
end

An expression type representing an affine expression of the form: ∑ᵢ aᵢ xᵢ + c.

Fields

  • .constant: the constant c in the expression.

  • .terms: an OrderedDict, with keys of VarType and values of CoefType describing the sparse vector a.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[2] + 3.0 * x[1] + 4.0
x[2] + 3 x[1] + 4

julia> expr.constant
4.0

julia> expr.terms
OrderedCollections.OrderedDict{VariableRef, Float64} with 2 entries:
  x[2] => 1.0
  x[1] => 3.0
GenericAffExpr(constant::V, kv::Vararg{Pair{K,V},N}) where {K,V,N}

Create a GenericAffExpr by passing a constant and pairs of additional arguments.

Example

julia> model = Model();

julia> @variable(model, x);

julia> GenericAffExpr(1.0, x => 1.0)
x + 1
GenericAffExpr(constant::V, kv::AbstractArray{Pair{K,V}}) where {K,V}

Create a GenericAffExpr by passing a constant and a vector of pairs.

Example

julia> model = Model();

julia> @variable(model, x);

julia> GenericAffExpr(1.0, [x => 1.0])
x + 1
GenericModel{T}(
    [optimizer_factory;]
    add_bridges::Bool = true,
) where {T<:Real}

Create a new instance of a JuMP model.

If optimizer_factory is provided, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).

If optimizer_factory is not provided, use set_optimizer to set the optimizer before calling optimize!.

If add_bridges, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.

Value type T

Passing a type other than Float64 as the value type T is an advanced operation. The value type must match that expected by the chosen optimizer. Consult the optimizer's documentation for details.

If not documented, assume that the optimizer supports only Float64.

Choosing an unsupported value type will throw an MOI.UnsupportedConstraint or an MOI.UnsupportedAttribute error, the timing of which (during the model construction or during a call to optimize!) depends on how the solver is interfaced to JuMP.

Example

julia> model = GenericModel{BigFloat}();

julia> typeof(model)
GenericModel{BigFloat}
GenericNonlinearExpr{V}(head::Symbol, args::Vector{Any})
GenericNonlinearExpr{V}(head::Symbol, args::Any...)

The scalar-valued nonlinear function head(args...), represented as a symbolic expression tree, with the call operator head and ordered arguments in args.

V is the type of AbstractVariableRef present in the expression, and is used to help dispatch JuMP extensions.

head

The head::Symbol must be an operator supported by the model.

The default list of supported univariate operators is given by:

  • MOI.Nonlinear.DEFAULT_UNIVARIATE_OPERATORS

and the default list of supported multivariate operators is given by:

  • MOI.Nonlinear.DEFAULT_MULTIVARIATE_OPERATORS

Additional operators can be added using @operator.

See the full list of operators supported by a MOI.ModelLike by querying the MOI.ListOfSupportedNonlinearOperators attribute.

args

The vector args contains the arguments to the nonlinear function. If the operator is univariate, it must contain one element. Otherwise, it may contain multiple elements.

Given a subtype of AbstractVariableRef, V, for GenericNonlinearExpr{V}, each element must be one of the following:

  • a constant value of type T<:Real

  • a V

  • a GenericAffExpr{T,V}

  • a GenericQuadExpr{T,V}

  • a GenericNonlinearExpr{V}

where T<:Real and T == value_type(V).

Unsupported operators

If the optimizer does not support head, an MOI.UnsupportedNonlinearOperator error will be thrown.

There is no guarantee about when this error will be thrown; it may be thrown when the function is first added to the model, or it may be thrown when optimize! is called.

Example

To represent the function f(x) = sin(x)^2, do:

julia> model = Model();

julia> @variable(model, x)
x

julia> f = sin(x)^2
sin(x) ^ 2.0

julia> f = GenericNonlinearExpr{VariableRef}(
           :^,
           GenericNonlinearExpr{VariableRef}(:sin, x),
           2.0,
       )
sin(x) ^ 2.0
mutable struct GenericQuadExpr{CoefType,VarType} <: AbstractJuMPScalar
    aff::GenericAffExpr{CoefType,VarType}
    terms::OrderedDict{UnorderedPair{VarType}, CoefType}
end

An expression type representing a quadratic expression of the form: ∑ᵢⱼ qᵢⱼ xᵢ xⱼ + ∑ᵢ aᵢ xᵢ + c.

Fields

  • .aff: a GenericAffExpr representing the affine portion of the expression.

  • .terms: an OrderedDict, with keys of UnorderedPair{VarType} and values of CoefType, describing the sparse list of terms q.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1]^2 + x[1] * x[2] + 3.0 * x[1] + 4.0
2 x[1]² + x[1]*x[2] + 3 x[1] + 4

julia> expr.aff
3 x[1] + 4

julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 2 entries:
  UnorderedPair{VariableRef}(x[1], x[1]) => 2.0
  UnorderedPair{VariableRef}(x[1], x[2]) => 1.0
GenericQuadExpr(
    aff::GenericAffExpr{V,K},
    kv::AbstractArray{Pair{UnorderedPair{K},V}}
) where {K,V}

Create a GenericQuadExpr by passing a GenericAffExpr and a vector of (UnorderedPair, coefficient) pairs.

Example

julia> model = Model();

julia> @variable(model, x);

julia> GenericQuadExpr(GenericAffExpr(1.0, x => 2.0), [UnorderedPair(x, x) => 3.0])
3 x² + 2 x + 1
GenericReferenceMap{T}

Mapping between the variable and constraint references of a model and its copy. The reference in the copied model can be obtained by indexing the map with the corresponding reference of the original model.
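
A minimal sketch of obtaining and indexing such a map via copy_model (exact output may vary):

julia> model = Model();

julia> @variable(model, x);

julia> new_model, reference_map = copy_model(model);

julia> x_new = reference_map[x]
x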

GenericVariableRef{T} <: AbstractVariableRef

Holds a reference to the model and the corresponding MOI.VariableIndex.

GenericVariableRef{T}(c::ConstraintRef)

Get the variable associated with a ConstraintRef, if c is a constraint on a single variable.

Example

julia> model = Model();

julia> @variable(model, x >= 0)
x

julia> c = LowerBoundRef(x)
x ≥ 0

julia> VariableRef(c) == x
true
GreaterThanZero()

A struct used to intercept when >= or ≥ is used in a macro via operator_to_set.

This struct is not the same as Nonnegatives so that we can disambiguate x >= y and x - y in Nonnegatives().

This struct is not intended for general usage, but it may be useful to some JuMP extensions.

Example

julia> operator_to_set(error, Val(:>=))
GreaterThanZero()
HermitianMatrixAdjointShape(side_dimension)

This shape is not intended for regular use.

HermitianMatrixShape(
    side_dimension::Int;
    needs_adjoint_dual::Bool = false,
)

The shape object for a Hermitian square matrix of side_dimension rows and columns.

The vectorized form corresponds to MOI.HermitianPositiveSemidefiniteConeTriangle.

needs_adjoint_dual

By default, the dual_shape of HermitianMatrixShape is also HermitianMatrixShape. This is true for cases such as a LinearAlgebra.Hermitian matrix in HermitianPSDCone.

However, JuMP also supports LinearAlgebra.Hermitian matrix in Zeros, which is interpreted as an element-wise equality constraint. By exploiting symmetry, we pass only the upper triangle of the equality constraints. This works for the primal, but it leads to a factor of 2 difference in the off-diagonal dual elements. (The dual value of the (i, j) element in the triangle formulation should be divided by 2 when spread across the (i, j) and (j, i) elements in the square matrix formulation.) If the constraint has this dual inconsistency, set needs_adjoint_dual = true.

HermitianMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be hermitian.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in HermitianMatrixSpace())
2×2 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
 real(Q[1,1])                    real(Q[1,2]) + imag(Q[1,2]) im
 real(Q[1,2]) - imag(Q[1,2]) im  real(Q[2,2])
HermitianPSDCone

Hermitian positive semidefinite cone object that can be used to create a Hermitian positive semidefinite square matrix in the @variable and @constraint macros.

Example

Consider the following example:

julia> model = Model();

julia> @variable(model, H[1:3, 1:3] in HermitianPSDCone())
3×3 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
 real(H[1,1])                    …  real(H[1,3]) + imag(H[1,3]) im
 real(H[1,2]) - imag(H[1,2]) im     real(H[2,3]) + imag(H[2,3]) im
 real(H[1,3]) - imag(H[1,3]) im     real(H[3,3])

julia> all_variables(model)
9-element Vector{VariableRef}:
 real(H[1,1])
 real(H[1,2])
 real(H[2,2])
 real(H[1,3])
 real(H[2,3])
 real(H[3,3])
 imag(H[1,2])
 imag(H[1,3])
 imag(H[2,3])

julia> all_constraints(model, Vector{VariableRef}, MOI.HermitianPositiveSemidefiniteConeTriangle)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorOfVariables, MathOptInterface.HermitianPositiveSemidefiniteConeTriangle}}}:
 [real(H[1,1]), real(H[1,2]), real(H[2,2]), real(H[1,3]), real(H[2,3]), real(H[3,3]), imag(H[1,2]), imag(H[1,3]), imag(H[2,3])] ∈ MathOptInterface.HermitianPositiveSemidefiniteConeTriangle(3)

We see in the output of the last commands that 9 real variables were created. The matrix H contains affine expressions in terms of these 9 variables that parametrize a Hermitian matrix.

LPMatrixData{T}

The struct returned by lp_matrix_data. See lp_matrix_data for a description of the public fields.

LessThanZero()

A struct used to intercept when <= or ≤ is used in a macro via operator_to_set.

This struct is not the same as Nonpositives so that we can disambiguate x <= y and x - y in Nonpositives().

This struct is not intended for general usage, but it may be useful to some JuMP extensions.

Example

julia> operator_to_set(error, Val(:<=))
LessThanZero()
LinearTermIterator{GAE<:GenericAffExpr}

A struct that implements the iterate protocol in order to iterate over tuples of (coefficient, variable) in the GenericAffExpr.
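
Such an iterator is returned by linear_terms; a minimal sketch (exact output may vary):

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] + 3.0 * x[2] + 1.0;

julia> for (coef, var) in linear_terms(expr)
           println(coef, " ", var)
       end
2.0 x[1]
3.0 x[2]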

Model([optimizer_factory;] add_bridges::Bool = true)

Create a new instance of a JuMP model.

If optimizer_factory is provided, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).

If optimizer_factory is not provided, use set_optimizer to set the optimizer before calling optimize!.

If add_bridges, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> solver_name(model)
"Ipopt"

julia> import HiGHS

julia> import MultiObjectiveAlgorithms as MOA

julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer); add_bridges = false);
ModelMode

An enum to describe the state of the CachingOptimizer inside a JuMP model.

See also: mode.

Values

Possible values are:

  • AUTOMATIC: moi_backend field holds a CachingOptimizer in AUTOMATIC mode.

  • MANUAL: moi_backend field holds a CachingOptimizer in MANUAL mode.

  • DIRECT: moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored.
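
The current mode can be queried with mode; a minimal sketch (exact output may vary by JuMP version):

julia> model = Model();

julia> mode(model)
AUTOMATIC::ModelMode = 0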

struct NoOptimizer <: Exception end

An error thrown when no optimizer is set and one is required.

The optimizer can be provided to the Model constructor or by calling set_optimizer.

Example

julia> model = Model();

julia> optimize!(model)
ERROR: NoOptimizer()
Stacktrace:
[...]
NonlinearConstraintRef
Compatibility

This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

NonlinearExpr

Alias for GenericNonlinearExpr{VariableRef}, the specific GenericNonlinearExpr used by JuMP.

NonlinearExpression <: AbstractJuMPScalar

A struct to represent a nonlinear expression.

Create an expression using @NLexpression.

Compatibility

This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

NonlinearOperator(func::Function, head::Symbol)

A callable struct (functor) representing a function named head.

When called with AbstractJuMPScalars, the struct returns a GenericNonlinearExpr.

When called with non-JuMP types, the struct returns the evaluation of func(args...).

Unless head is special-cased by the optimizer, the operator must have already been added to the model using add_nonlinear_operator or @operator.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)

julia> bar = NonlinearOperator(f, :op_f)
NonlinearOperator(f, :op_f)

julia> @objective(model, Min, bar(x))
op_f(x)

julia> bar(2.0)
4.0
NonlinearParameter <: AbstractJuMPScalar

A struct to represent a nonlinear parameter.

Create a parameter using @NLparameter.

Compatibility

This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Nonnegatives()

The JuMP equivalent of the MOI.Nonnegatives set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Nonnegatives())
[x[1], x[2]] ∈ Nonnegatives()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x >= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Nonnegatives()
Nonpositives()

The JuMP equivalent of the MOI.Nonpositives set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Nonpositives())
[x[1], x[2]] ∈ Nonpositives()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x <= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Nonpositives()
struct OptimizeNotCalled <: Exception end

An error thrown when a result attribute cannot be queried before optimize! is called.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> objective_value(model)
ERROR: OptimizeNotCalled()
Stacktrace:
[...]
PSDCone

Positive semidefinite cone object that can be used to constrain a square matrix to be positive semidefinite in the @constraint macro.

If the matrix has type Symmetric, then the column vectorization (the vector obtained by concatenating the columns) of its upper triangular part is constrained to belong to the MOI.PositiveSemidefiniteConeTriangle set; otherwise its column vectorization is constrained to belong to the MOI.PositiveSemidefiniteConeSquare set.

Example

Non-symmetric case:

julia> model = Model();

julia> @variable(model, x);

julia> a = [x 2x; 2x x];

julia> b = [1 2; 2 4];

julia> cref = @constraint(model, a >= b, PSDCone())
[x - 1    2 x - 2
 2 x - 2  x - 4] ∈ PSDCone()

julia> jump_function(constraint_object(cref))
4-element Vector{AffExpr}:
 x - 1
 2 x - 2
 2 x - 2
 x - 4

julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeSquare(2)

Symmetric case:

julia> using LinearAlgebra # For Symmetric

julia> model = Model();

julia> @variable(model, x);

julia> a = [x 2x; 2x x];

julia> b = [1 2; 2 4];

julia> cref = @constraint(model, Symmetric(a - b) in PSDCone())
[x - 1  2 x - 2
 ⋯      x - 4] ∈ PSDCone()

julia> jump_function(constraint_object(cref))
3-element Vector{AffExpr}:
 x - 1
 2 x - 2
 x - 4

julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeTriangle(2)
Parameter(value)

A short-cut for the MOI.Parameter set.

Example

julia> model = Model();

julia> @variable(model, x in Parameter(2))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Parameter{Float64}(2.0)
QuadExpr

An alias for GenericQuadExpr{Float64,VariableRef}, the specific GenericQuadExpr used by JuMP.

QuadTermIterator{GQE<:GenericQuadExpr}

A struct that implements the iterate protocol in order to iterate over tuples of (coefficient, variable, variable) in the GenericQuadExpr.
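
Such an iterator is returned by quad_terms; a minimal sketch (exact output may vary):

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] * x[2] + x[1];

julia> for (coef, var_1, var_2) in quad_terms(expr)
           println(coef, " ", var_1, " ", var_2)
       end
2.0 x[1] x[2]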

RotatedSecondOrderCone

Rotated second order cone object that can be used to constrain the square of the Euclidean norm of a vector x to be less than or equal to 2tu, where t and u are nonnegative scalars. This is a shortcut for the MOI.RotatedSecondOrderCone.

Example

The following constrains (x - 1)² + (x - 2)² ≤ 2tx and t, x ≥ 0:

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, t)
t

julia> @constraint(model, [t, x, x-1, x-2] in RotatedSecondOrderCone())
[t, x, x - 1, x - 2] ∈ MathOptInterface.RotatedSecondOrderCone(4)
SOS1(weights = Real[])

The SOS1 (Special Ordered Set of Type 1) set constrains a vector x to the set where at most one variable can take a non-zero value, and all other elements are zero.

The weights vector, if specified, induces an ordering of the variables; as such, it should contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to element x[i]. If not provided, the weights vector defaults to weights[i] = i.

This is a shortcut for the MOI.SOS1 set.

Example

julia> model = Model();

julia> @variable(model, x[1:3] in SOS1([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> print(model)
Feasibility
Subject to
 [x[1], x[2], x[3]] ∈ MathOptInterface.SOS1{Float64}([4.1, 3.2, 5.0])
SOS2(weights = Real[])

The SOS2 (Special Ordered Set of Type 2) set constrains a vector x to the set where at most two variables can take a non-zero value, and all other elements are zero. In addition, the two non-zero values must be consecutive given the ordering of the x vector induced by weights.

The weights vector, if specified, induces an ordering of the variables; as such, it must contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to element x[i]. If not provided, the weights vector defaults to weights[i] = i.

This is a shortcut for the MOI.SOS2 set.

Example

julia> model = Model();

julia> @variable(model, x[1:3] in SOS2([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> print(model)
Feasibility
Subject to
 [x[1], x[2], x[3]] ∈ MathOptInterface.SOS2{Float64}([4.1, 3.2, 5.0])
struct ScalarConstraint

The data for a scalar constraint.

See also the documentation on JuMP’s representation of constraints for more background.

Fields

  • .func: field contains a JuMP object representing the function

  • .set: field contains the MOI set

Example

A scalar constraint:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))

julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}

julia> object.func
2 x

julia> object.set
MathOptInterface.LessThan{Float64}(1.0)
ScalarShape()

An AbstractShape that represents scalar constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x[2] <= 1);

julia> shape(constraint_object(c))
ScalarShape()
ScalarVariable{S,T,U,V} <: AbstractVariable

A struct used when adding variables.

See also: add_variable.

SecondOrderCone

Second order cone object that can be used to constrain the Euclidean norm of a vector x to be less than or equal to a nonnegative scalar t. This is a shortcut for the MOI.SecondOrderCone.

Example

The following constrains ‖(x - 1, x - 2)‖₂ ≤ t and t ≥ 0:

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, t)
t

julia> @constraint(model, [t, x-1, x-2] in SecondOrderCone())
[t, x - 1, x - 2] ∈ MathOptInterface.SecondOrderCone(3)
Semicontinuous(lower, upper)

A short-cut for the MOI.Semicontinuous set.

This short-cut is useful because it automatically promotes lower and upper to the same type, and converts them into the element type supported by the JuMP model.

Example

julia> model = Model();

julia> @variable(model, x in Semicontinuous(1, 2))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Semicontinuous{Int64}(1, 2)
Semiinteger(lower, upper)

A short-cut for the MOI.Semiinteger set.

This short-cut is useful because it automatically promotes lower and upper to the same type, and converts them into the element type supported by the JuMP model.

Example

julia> model = Model();

julia> @variable(model, x in Semiinteger(3, 5))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Semiinteger{Int64}(3, 5)
SensitivityReport
SkewSymmetricMatrixShape

Shape object for a skew symmetric square matrix of side_dimension rows and columns. The vectorized form contains the entries of the upper-right triangular part of the matrix (without the diagonal) given column by column (or equivalently, the entries of the lower-left triangular part given row by row). The diagonal is zero.

SkewSymmetricMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be skew-symmetric.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in SkewSymmetricMatrixSpace())
2×2 Matrix{AffExpr}:
 0        Q[1,2]
 -Q[1,2]  0
SkipModelConvertScalarSetWrapper(set::MOI.AbstractScalarSet)

JuMP uses model_convert to automatically promote MOI.AbstractScalarSet sets to the same value_type as the model.

In cases where this is undesirable, wrap the set in SkipModelConvertScalarSetWrapper to pass the set unchanged to the solver.

This struct is intended for use internally by JuMP extensions. You should not need to use it in regular JuMP code.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, x in MOI.EqualTo(1 // 2))
x = 0.5

julia> @constraint(model, x in SkipModelConvertScalarSetWrapper(MOI.EqualTo(1 // 2)))
x = 1//2
SquareMatrixShape

Shape object for a square matrix of side_dimension rows and columns. The vectorized form contains the entries of the matrix given column by column (or equivalently, the entries of the lower-left triangular part given row by row).

SymmetricMatrixAdjointShape(side_dimension)

This shape is not intended for regular use.

SymmetricMatrixShape(
    side_dimension::Int;
    needs_adjoint_dual::Bool = false,
)

The shape object for a symmetric square matrix of side_dimension rows and columns.

The vectorized form contains the entries of the upper-right triangular part of the matrix given column by column (or equivalently, the entries of the lower-left triangular part given row by row).

needs_adjoint_dual

By default, the dual_shape of SymmetricMatrixShape is also SymmetricMatrixShape. This is true for cases such as a LinearAlgebra.Symmetric matrix in PSDCone.

However, JuMP also supports LinearAlgebra.Symmetric matrix in Zeros, which is interpreted as an element-wise equality constraint. By exploiting symmetry, we pass only the upper triangle of the equality constraints. This works for the primal, but it leads to a factor of 2 difference in the off-diagonal dual elements. (The dual value of the (i, j) element in the triangle formulation should be divided by 2 when spread across the (i, j) and (j, i) elements in the square matrix formulation.) If the constraint has this dual inconsistency, set needs_adjoint_dual = true.

SymmetricMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be symmetric.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in SymmetricMatrixSpace())
2×2 LinearAlgebra.Symmetric{VariableRef, Matrix{VariableRef}}:
 Q[1,1]  Q[1,2]
 Q[1,2]  Q[2,2]
UnorderedPair(a::T, b::T)

A wrapper type used by GenericQuadExpr with fields .a and .b.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] * x[2]
2 x[1]*x[2]

julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 1 entry:
  UnorderedPair{VariableRef}(x[1], x[2]) => 2.0
VariableConstrainedOnCreation <: AbstractVariable

Variable scalar_variable constrained to belong to set.

Adding this variable can be understood as doing:

function JuMP.add_variable(
    model::GenericModel,
    variable::VariableConstrainedOnCreation,
    name,
)
    var_ref = add_variable(model, variable.scalar_variable, name)
    add_constraint(model, VectorConstraint(var_ref, variable.set))
    return var_ref
end

but adds the variables with MOI.add_constrained_variable(model, variable.set) instead.

VariableInfo{S,T,U,V}

A struct used by JuMP internally when creating variables. This may also be used by JuMP extensions to create new types of variables.

See also: ScalarVariable.

struct VariableNotOwned{V<:AbstractVariableRef} <: Exception
    variable::V
end

The variable variable was used in a model different to owner_model(variable).
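
One way this error can arise, sketched below (the exact error text may vary by JuMP version):

julia> model = Model();

julia> @variable(model, x);

julia> model_new = Model();

julia> MOI.get(model_new, MOI.VariableName(), x)
ERROR: VariableNotOwned{VariableRef}(x)
Stacktrace:
[...]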

VariablesConstrainedOnCreation <: AbstractVariable

Vector of variables scalar_variables constrained to belong to set. Adding this variable can be thought of as doing:

function JuMP.add_variable(
    model::GenericModel,
    variable::VariablesConstrainedOnCreation,
    names,
)
    v_names = vectorize(names, variable.shape)
    var_refs = add_variable.(model, variable.scalar_variables, v_names)
    add_constraint(model, VectorConstraint(var_refs, variable.set))
    return reshape_vector(var_refs, variable.shape)
end

but adds the variables with MOI.add_constrained_variables(model, variable.set) instead. See the MOI documentation for the difference between adding the variables with MOI.add_constrained_variables and adding them with MOI.add_variables and adding the constraint separately.

struct VectorConstraint

The data for a vector constraint.

See also the documentation on JuMP’s representation of constraints.

Fields

  • func: field contains a JuMP object representing the function

  • set: field contains the MOI set.

  • shape: field contains an AbstractShape matching the form in which the constraint was constructed (for example, by using matrices or flat vectors).

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())

julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}

julia> object.func
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> object.set
MathOptInterface.SecondOrderCone(3)

julia> object.shape
VectorShape()
VectorShape()

An AbstractShape that represents vector-valued constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x in SOS1());

julia> shape(constraint_object(c))
VectorShape()
Zeros()

The JuMP equivalent of the MOI.Zeros set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Zeros())
[x[1], x[2]] ∈ Zeros()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x == b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Zeros()
_VariableValueMap{F}

A lazy cache used for computing the primal variable solution in value.

This avoids the need to rewrite the nonlinear expressions from MOI_VARIABLE to VARIABLE, as well as eagerly computing the var_value for every variable. We use a cache so we don’t have to recompute variables we have already seen.

copy(model::AbstractModel)

Return a copy of the model model. It is similar to copy_model except that it does not return the mapping between the references of model and its copy.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, independently of whether an optimizer was provided at model construction, the new model will have no optimizer; that is, an optimizer will have to be provided to the new model in the optimize! call.

Example

In the following example, a model model is constructed with a variable x and a constraint cref. It is then copied into a model new_model with the new references assigned to x_new and cref_new.

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, cref, x == 2)
cref : x = 2

julia> new_model = copy(model);

julia> x_new = new_model[:x]
x

julia> cref_new = new_model[:cref]
cref : x = 2
empty!(model::GenericModel)::GenericModel

Empty the model, that is, remove all variables, constraints and model attributes, but not optimizer attributes. Always return the argument.

Note: this also removes any extension data.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> isempty(model)
false

julia> empty!(model)
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> print(model)
Feasibility
Subject to

julia> isempty(model)
true
Base.getindex(m::JuMP.AbstractModel, name::Symbol)

To allow easy accessing of JuMP Variables and Constraints via [] syntax.

Returns the variable, or group of variables, or constraint, or group of constraints, of the given name which were added to the model. This errors if multiple variables or constraints share the same name.
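
A minimal sketch (exact output may vary):

julia> model = Model();

julia> @variable(model, x);

julia> model[:x] === x
true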

haskey(model::AbstractModel, name::Symbol)

Determine whether the model has a mapping for a given name.
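
A minimal sketch:

julia> model = Model();

julia> @variable(model, x);

julia> haskey(model, :x)
true

julia> haskey(model, :y)
false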

isempty(model::GenericModel)

Verifies whether the model is empty, that is, whether the MOI backend is empty and whether the model is in the same state as at its creation, apart from optimizer attributes.

Example

julia> model = Model();

julia> isempty(model)
true

julia> @variable(model, x[1:2]);

julia> isempty(model)
false
Base.read(
    io::IO,
    ::Type{<:GenericModel};
    format::MOI.FileFormats.FileFormat,
    kwargs...,
)

Return a JuMP model read from io in the format format.

Other kwargs are passed to the Model constructor of the chosen format.

Base.setindex!(m::JuMP.AbstractModel, value, name::Symbol)

Stores the object value in the model m under the name name so that it can be accessed via getindex. Can be called with [] syntax.
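
A minimal sketch (the key :my_expr is an arbitrary choice for illustration; exact output may vary):

julia> model = Model();

julia> @variable(model, x);

julia> model[:my_expr] = 2x
2 x

julia> model[:my_expr]
2 x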

Base.show([io::IO], summary::SolutionSummary; verbose::Bool = false)

Write a summary of the solution results to io (or to stdout if io is not given).

Base.write(
    io::IO,
    model::GenericModel;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_MOF,
    kwargs...,
)

Write the JuMP model model to io in the format format.

Other kwargs are passed to the Model constructor of the chosen format.
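
A minimal round-trip sketch combining write and read (assuming the model can be represented in the chosen file format):

julia> model = Model();

julia> @variable(model, x >= 0);

julia> io = IOBuffer();

julia> write(io, model; format = MOI.FileFormats.FORMAT_MPS);

julia> seekstart(io);

julia> model_2 = read(io, Model; format = MOI.FileFormats.FORMAT_MPS);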

BinaryRef(v::GenericVariableRef)

Return a constraint reference to the constraint constraining v to be binary. Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> BinaryRef(x)
x binary
FixRef(v::GenericVariableRef)

Return a constraint reference to the constraint fixing the value of v.

Errors if one does not exist.

See also is_fixed, fix_value, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> FixRef(x)
x = 1
IntegerRef(v::GenericVariableRef)

Return a constraint reference to the constraint constraining v to be integer.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x, Int);

julia> IntegerRef(x)
x integer
LowerBoundRef(v::GenericVariableRef)

Return a constraint reference to the lower bound constraint of v.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x >= 1.0);

julia> LowerBoundRef(x)
x ≥ 1
NLPEvaluator(
    model::Model,
    _differentiation_backend::MOI.Nonlinear.AbstractAutomaticDifferentiation =
        MOI.Nonlinear.SparseReverseMode(),
)

Return an MOI.AbstractNLPEvaluator constructed from model.

Before using, you must initialize the evaluator using MOI.initialize.

Experimental

These features may change or be removed in any future version of JuMP.

Pass _differentiation_backend to specify the differentiation backend used to compute derivatives.
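
A minimal sketch using the legacy @NLobjective macro (exact output may vary by JuMP version):

julia> model = Model();

julia> @variable(model, x);

julia> @NLobjective(model, Min, sin(x));

julia> d = NLPEvaluator(model);

julia> MOI.initialize(d, [:Grad]);

julia> MOI.eval_objective(d, [1.0])
0.8414709848078965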

ParameterRef(x::GenericVariableRef)

Return a constraint reference to the constraint constraining x to be a parameter.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> ParameterRef(p)
p ∈ MathOptInterface.Parameter{Float64}(2.0)

julia> @variable(model, x);

julia> ParameterRef(x)
ERROR: Variable x is not a parameter.
Stacktrace:
[...]
UpperBoundRef(v::GenericVariableRef)

Return a constraint reference to the upper bound constraint of v.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x <= 1.0);

julia> UpperBoundRef(x)
x ≤ 1
_compute_rhs_range(d_B, x_B, l_B, u_B, atol)

Assume we start with the optimal solution x_old, we want to compute a step size t in a direction d such that x_new = x_old + t * d is still represented by the same optimal basis. This can be computed a la primal simplex where we use an artificial entering variable.

A * x_new = A * (x_old + t * d)
            = A * x_old + t * A * d
            = 0         + t * A * d  # Since A * x_old = 0
=>  A * d = 0
=> B * d_B + N * d_N = 0
=> d_B = B \ -(N * d_N)

Note we only have to compute the basic component of the direction vector, because d_N is just zeros with a 1 in the component associated with the artificial entering variable. Therefore, all that remains is to compute the associated column of N.

If we are increasing the bounds associated with the ith decision variable, then our artificial entering variable is a duplicate of the ith variable, and N * d_N = A[:, i].

If we are increasing the bounds associated with the ith affine constraint, then our artificial entering variable is a duplicate of the slack variable associated with the ith constraint, that is, a -1 in the ith row and zeros everywhere else.

In either case:

d_B = -(B \ A[:, i])

Having computed a direction d such that x_new = x_old + t * d, and ensured that A * d = 0, we have maintained structural feasibility. Now we need to compute bounds on t such that x_new maintains bound feasibility. That is, compute bounds on t such that:

l_B[j] <= x_B[j] + t * d_B[j] <= u_B[j].
_desparsify(x)

If x is an AbstractSparseArray, return the dense equivalent, otherwise just return x.

This function is used in _build_constraint.

Why is this needed?

When broadcasting f.(x) over an AbstractSparseArray x, Julia first calls the equivalent of f(zero(eltype(x))). Here’s an example:

julia> import SparseArrays

julia> foo(x) = (println("Calling $(x)"); x)
foo (generic function with 1 method)

julia> foo.(SparseArrays.sparsevec([1, 2], [1, 2]))
Calling 1
Calling 2
2-element SparseArrays.SparseVector{Int64, Int64} with 2 stored entries:
  [1]  =  1
  [2]  =  2

However, if f is mutating, this can have serious consequences! In our case, broadcasting build_constraint will add a new 0 = 0 constraint.

Sparse arrays most-often arise when some input data to the constraint is sparse (for example, a constant vector or matrix). Due to promotion and arithmetic, this results in a constraint function that is represented by an AbstractSparseArray, but is actually dense. Thus, we can safely collect the matrix into a dense array.

If the function is sparse, it’s not obvious what to do. What is the "zero" element of the result? What does it mean to broadcast build_constraint over a sparse array adding scalar constraints? This likely means that the user is using the wrong data structure. For simplicity, let’s also call collect into a dense array, and wait for complaints.

_eval_as_variable(f::F, x::GenericAffExpr, args...) where {F}

In many cases, @variable can return a GenericAffExpr instead of a GenericVariableRef. This is particularly the case for complex-valued expressions. To make common operations like lower_bound(x) work, we should forward the method if and only if x is convertible to a GenericVariableRef.

_fill_vaf!(
    terms::Vector{<:MOI.VectorAffineTerm},
    offset::Int,
    oi::Int,
    aff::AbstractJuMPScalar,
)

Fills the vector terms at indices starting at offset+1 with the affine terms of aff. The output index for all terms is oi. Return the index of the last term added.

_fill_vqf!(terms::Vector{<:MOI.VectorQuadraticTerm}, offset::Int, oi::Int,
           quad::AbstractJuMPScalar)

Fills the vector terms at indices starting at offset+1 with the quadratic terms of quad. The output index for all terms is oi. Return the index of the last term added.

_finalize_macro(
    model,
    code,
    source::LineNumberNode;
    register_name::Union{Nothing,Symbol} = nothing,
    wrap_let::Bool = false,
)

Wraps the code generated by a macro in a code block with the first argument as source, the LineNumberNode of where the macro was called from in the user’s code. This results in better stacktraces in error messages.

In addition, this function adds a check that model is a valid AbstractModel.

If register_name is a Symbol, register the result of code in model under the name register_name.

If wrap_let, wraps code in a let model = model block to enforce the model as a local variable.

_is_lp(model::GenericModel)

Return true if model is a linear program.

_moi_quadratic_term(t::Tuple)

Return the MOI.ScalarQuadraticTerm for the quadratic term t, element of the quad_terms iterator. Note that the VariableRefs are transformed into MOI.VariableIndexs hence the owner model information is lost.

_nlp_objective_function(model::GenericModel)

Returns the nonlinear objective function or nothing if no nonlinear objective function is set.

_parse_nonlinear_expression(model::GenericModel, x::Expr)

JuMP needs to build Nonlinear expression objects in macro scope. This has two main challenges:

  1. We need to evaluate local variables into the expressions. This is reasonably easy, anywhere we see a symbol that is not a function call, replace it by esc(x).

  2. We need to identify un-registered user-defined functions so that we can attempt to automatically register them if their symbolic name exists in the scope. I (@odow) originally introduced the auto-registration in https://github.com/jump-dev/JuMP.jl/pull/2537 to fix a common pain-point in JuMP, but after working through this I believe it was a mistake. It’s a lot of hassle! One problem is that the design of Nonlinear has moved the expression parsing from macro-expansion time to runtime. I think this is a big win for readability of the system, but it means we lose access to the caller’s local scope. My solution to maintain backwards compatibility is to check that every function call is registered before parsing the expression.

_print_latex(io::IO, model::AbstractModel)

Print a LaTeX formulation of model to io.

For this method to work, an AbstractModel subtype must implement:

  • objective_function_string

  • constraints_string

  • _nl_subexpression_string

_print_model(io::IO, model::AbstractModel)

Print a plain-text formulation of model to io.

For this method to work, an AbstractModel subtype must implement:

  • objective_function_string

  • constraints_string

  • _nl_subexpression_string

_print_summary(io::IO, model::AbstractModel)

Print a plain-text summary of model to io.

For this method to work, an AbstractModel subtype should implement:

  • name(::AbstractModel)

  • show_objective_function_summary

  • show_constraints_summary

  • show_backend_summary

_replace_zero(model::M, x) where {M<:AbstractModel}

Replaces _MA.Zero with a floating point zero(value_type(M)).

_rewrite_expression(expr)

If expr is not an Expr, then rewriting it won’t do anything. We just need to copy if it is mutable so that future operations do not modify the user’s data.

_rewrite_expression(expr)

A helper function so that we can change how we rewrite expressions in a single place and have it cascade to all locations in the JuMP macros that rewrite expressions.

_standard_form_matrix(model::GenericModel)

See lp_matrix_data instead.

add_bridge(
    model::GenericModel{T},
    BT::Type{<:MOI.Bridges.AbstractBridge};
    coefficient_type::Type{S} = T,
) where {T,S}

Add BT{T} to the list of bridges that can be used to transform unsupported constraints into an equivalent formulation using only constraints supported by the optimizer.

See also: remove_bridge.

Example

julia> model = Model();

julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> add_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64}
       )
add_constraint(
    model::GenericModel,
    con::AbstractConstraint,
    name::String = "",
)

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

add_nonlinear_constraint(model::Model, expr::Expr)

Add a nonlinear constraint described by the Julia expression ex to model.

This function is most useful if the expression ex is generated programmatically, and you cannot use @NLconstraint.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> add_nonlinear_constraint(model, :($(x) + $(x)^2 <= 1))
(x + x ^ 2.0) - 1.0 ≤ 0
add_nonlinear_expression(model::Model, expr::Expr)

Add a nonlinear expression expr to model.

This function is most useful if the expression expr is generated programmatically, and you cannot use @NLexpression.

Совместимость

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> add_nonlinear_expression(model, :($(x) + $(x)^2))
subexpression[1]: x + x ^ 2.0
add_nonlinear_operator(
    model::Model,
    dim::Int,
    f::Function,
    [∇f::Function,]
    [∇²f::Function];
    [name::Symbol = Symbol(f),]
)

Add a new nonlinear operator with dim input arguments to model and associate it with the name name.

The function f evaluates the operator and must return a scalar.

The optional function ∇f evaluates the first derivative, and the optional function ∇²f evaluates the second derivative.

∇²f may be provided only if ∇f is also provided.

Univariate syntax

If dim == 1, then the method signatures of each function must be:

  • f(::T)::T where {T<:Real}

  • ∇f(::T)::T where {T<:Real}

  • ∇²f(::T)::T where {T<:Real}

Multivariate syntax

If dim > 1, then the method signatures of each function must be:

  • f(x::T...)::T where {T<:Real}

  • ∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}

  • ∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}

The gradient vector g and Hessian matrix H are filled in-place. For the Hessian, you must fill in the non-zero lower-triangular entries only; setting an off-diagonal upper-triangular element may error.
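
For illustration, here is a hedged sketch of the multivariate form with a hypothetical two-argument operator g (the names g, ∇g, and ∇²g are illustrative only, not part of JuMP):

model = Model()
@variable(model, x[1:2])

g(x...) = x[1]^2 + x[2]^2

function ∇g(grad::AbstractVector, x...)
    grad[1] = 2 * x[1]
    grad[2] = 2 * x[2]
    return
end

function ∇²g(H::AbstractMatrix, x...)
    # Fill only the diagonal and lower-triangular entries.
    H[1, 1] = 2.0
    H[2, 2] = 2.0
    return
end

op_g = add_nonlinear_operator(model, 2, g, ∇g, ∇²g; name = :g)
@objective(model, Min, op_g(x[1], x[2]))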

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> op_f = add_nonlinear_operator(model, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :f)

julia> @objective(model, Min, op_f(x))
f(x)

julia> op_f(2.0)
4.0
add_nonlinear_parameter(model::Model, value::Real)

Add an anonymous parameter to the model.

Совместимость

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.
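
For example, a minimal sketch using the legacy interface (the printed result assumes the documented behavior of value for legacy parameters):

julia> model = Model();

julia> p = add_nonlinear_parameter(model, 10.0);

julia> value(p)
10.0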

add_to_expression!(expression, terms...)

Updates expression in-place to expression + (*)(terms...).

This is typically much more efficient than expression += (*)(terms...) because it avoids the temporary allocation of the right-hand side term.

For example, add_to_expression!(expression, a, b) produces the same result as expression += a*b, and add_to_expression!(expression, a) produces the same result as expression += a.

When to implement

Only a few methods are defined, mostly for internal use, and only for the cases when:

  1. they can be implemented efficiently

  2. expression is capable of storing the result. For example, add_to_expression!(::AffExpr, ::GenericVariableRef, ::GenericVariableRef) is not defined because a GenericAffExpr cannot store the product of two variables.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> expr = 2 + x
x + 2

julia> add_to_expression!(expr, 3, x)
4 x + 2

julia> expr
4 x + 2
add_to_function_constant(constraint::ConstraintRef, value)

Add value to the function constant term of constraint.

Note that for scalar constraints, JuMP will aggregate all constant terms onto the right-hand side of the constraint so instead of modifying the function, the set will be translated by -value. For example, given a constraint 2x <= 3, add_to_function_constant(c, 4) will modify it to 2x <= -1.

Example

For scalar constraints, the set is translated by -value:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 0 <= 2x - 1 <= 2)
con : 2 x ∈ [1, 3]

julia> add_to_function_constant(con, 4)

julia> con
con : 2 x ∈ [-3, -1]

For vector constraints, the constant is added to the function:

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @constraint(model, con, [x + y, x, y] in SecondOrderCone())
con : [x + y, x, y] ∈ MathOptInterface.SecondOrderCone(3)

julia> add_to_function_constant(con, [1, 2, 2])

julia> con
con : [x + y + 1, x + 2, y + 2] ∈ MathOptInterface.SecondOrderCone(3)
add_variable(m::GenericModel, v::AbstractVariable, name::String = "")

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.
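
As a hedged sketch of the extension pattern (MyWrappedVariable is a hypothetical type, not part of JuMP), an extension typically wraps a standard variable and delegates to the built-in method:

struct MyWrappedVariable <: JuMP.AbstractVariable
    variable::JuMP.ScalarVariable
end

function JuMP.add_variable(
    model::GenericModel,
    v::MyWrappedVariable,
    name::String = "",
)
    # Delegate to the standard method for the wrapped scalar variable.
    return JuMP.add_variable(model, v.variable, name)
end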

all_constraints(model::GenericModel, function_type, set_type)::Vector{<:ConstraintRef}

Return a list of all constraints currently in the model where the function has type function_type and the set has type set_type. The constraints are ordered by creation time.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @constraint(model, 2x <= 1);

julia> all_constraints(model, VariableRef, MOI.GreaterThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}:
 x ≥ 0

julia> all_constraints(model, VariableRef, MOI.ZeroOne)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.ZeroOne}, ScalarShape}}:
 x binary

julia> all_constraints(model, AffExpr, MOI.LessThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 2 x ≤ 1
all_constraints(
    model::GenericModel;
    include_variable_in_set_constraints::Bool,
)::Vector{ConstraintRef}

Return a list of all constraints in model.

If include_variable_in_set_constraints == true, then VariableRef constraints such as VariableRef-in-Integer are included. To return only the structural constraints (for example, the rows in the constraint matrix of a linear program), pass include_variable_in_set_constraints = false.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Int);

julia> @constraint(model, 2x <= 1);

julia> @NLconstraint(model, x^2 <= 1);

julia> all_constraints(model; include_variable_in_set_constraints = true)
4-element Vector{ConstraintRef}:
 2 x ≤ 1
 x ≥ 0
 x integer
 x ^ 2.0 - 1.0 ≤ 0

julia> all_constraints(model; include_variable_in_set_constraints = false)
2-element Vector{ConstraintRef}:
 2 x ≤ 1
 x ^ 2.0 - 1.0 ≤ 0

Performance considerations

Note that this function is type-unstable because it returns an abstractly typed vector. If performance is a problem, consider using list_of_constraint_types and a function barrier. See the Performance tips for extensions section of the documentation for more details.

all_nonlinear_constraints(model::GenericModel)

Return a vector of all nonlinear constraint references in the model in the order they were added to the model.

Совместимость

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

This function returns only the constraints added with @NLconstraint and add_nonlinear_constraint. It does not return GenericNonlinearExpr constraints.

all_variables(model::GenericModel{T})::Vector{GenericVariableRef{T}} where {T}

Returns a list of all variables currently in the model. The variables are ordered by creation time.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> all_variables(model)
2-element Vector{VariableRef}:
 x
 y
anonymous_name(::MIME, x::AbstractVariableRef)

The name to use for an anonymous variable x when printing.

Example

julia> model = Model();

julia> x = @variable(model);

julia> anonymous_name(MIME("text/plain"), x)
"_[1]"
backend(model::GenericModel)

Return the lower-level MathOptInterface model that sits underneath JuMP. This model depends on which operating mode JuMP is in (see mode).

  • If JuMP is in DIRECT mode (that is, the model was created using direct_model), the backend will be the optimizer passed to direct_model.

  • If JuMP is in MANUAL or AUTOMATIC mode, the backend is a MOI.Utilities.CachingOptimizer.

Use index to get the index of a variable or constraint in the backend model.

This function should only be used by advanced users looking to access low-level MathOptInterface or solver-specific functionality.

Notes

If JuMP is not in DIRECT mode, the type returned by backend may change between any JuMP releases. Therefore, only use the public API exposed by MathOptInterface, and do not access internal fields. If you require access to the innermost optimizer, see unsafe_backend. Alternatively, use direct_model to create a JuMP model in DIRECT mode.

See also: unsafe_backend.

Example

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> highs = backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> index(x)
MOI.VariableIndex(1)
barrier_iterations(model::GenericModel)

If available, returns the cumulative number of barrier iterations during the most-recent optimization (the MOI.BarrierIterations attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> barrier_iterations(model)
0
bridge_constraints(model::GenericModel)

When in direct mode, return false.

When in manual or automatic mode, return a Bool indicating whether the optimizer is set and unsupported constraints are automatically bridged to equivalent supported constraints when an appropriate transformation is available.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> bridge_constraints(model)
true

julia> model = Model(Ipopt.Optimizer; add_bridges = false);

julia> bridge_constraints(model)
false
build_constraint(error_fn::Function, func, set, args...; kwargs...)

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.
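
As a hedged sketch (AtMost is a hypothetical type, not part of JuMP), an extension can intercept function-in-set constraints and lower them to a standard MOI set:

struct AtMost
    value::Float64
end

function JuMP.build_constraint(
    error_fn::Function,
    func::JuMP.AbstractJuMPScalar,
    set::AtMost,
)
    # Delegate to the standard scalar builder with an MOI.LessThan set.
    return JuMP.build_constraint(error_fn, func, MOI.LessThan(set.value))
end

# With this method defined, users could write, for example:
#   @constraint(model, 2x in AtMost(1.0))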

build_variable(
    error_fn::Function,
    info::VariableInfo,
    args...;
    kwargs...,
)

Return a new AbstractVariable object.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

Arguments

  • error_fn: a function to call instead of error. error_fn annotates the error message with additional information for the user.

  • info: an instance of VariableInfo. This has a variety of fields relating to the variable such as info.lower_bound and info.binary.

  • args: optional additional positional arguments for extending the @variable macro.

  • kwargs: optional keyword arguments for extending the @variable macro.

See also: @variable

Extensions should define a method with ONE positional argument to dispatch the call to a different method. Creating an extension that relies on multiple positional arguments leads to MethodErrors if the user passes the arguments in the wrong order.

Example

@variable(model, x, Foo)

will call

build_variable(error_fn::Function, info::VariableInfo, ::Type{Foo})

Passing special-case positional arguments such as Bin, Int, and PSD is okay, along with keyword arguments:

@variable(model, x, Int, Foo(), mykwarg = true)
# or
@variable(model, x, Foo(), Int, mykwarg = true)

will call

build_variable(error_fn::Function, info::VariableInfo, ::Foo; mykwarg)

and info.integer will be true.

Note that the order of the positional arguments does not matter.

callback_node_status(cb_data, model::GenericModel)

Return an MOI.CallbackNodeStatusCode enum, indicating if the current primal solution available from callback_value is integer feasible.

Example

julia> import GLPK

julia> model = Model(GLPK.Optimizer);

julia> @variable(model, x <= 10, Int);

julia> @objective(model, Max, x);

julia> function my_callback_function(cb_data)
           status = callback_node_status(cb_data, model)
           println("Status is: ", status)
           return
       end
my_callback_function (generic function with 1 method)

julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)

julia> optimize!(model)
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_INTEGER
Status is: CALLBACK_NODE_STATUS_INTEGER
callback_value(cb_data, x::GenericVariableRef)
callback_value(cb_data, x::Union{GenericAffExpr,GenericQuadExpr})

Return the primal solution of x inside a callback.

cb_data is the argument to the callback function, and the type is dependent on the solver.

Use callback_node_status to check whether a solution is available.

Example

julia> import GLPK

julia> model = Model(GLPK.Optimizer);

julia> @variable(model, x <= 10, Int);

julia> @objective(model, Max, x);

julia> function my_callback_function(cb_data)
           status = callback_node_status(cb_data, model)
           if status == MOI.CALLBACK_NODE_STATUS_INTEGER
               println("Solution is: ", callback_value(cb_data, x))
           end
           return
       end
my_callback_function (generic function with 1 method)

julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)

julia> optimize!(model)
Solution is: 10.0
Solution is: 10.0
check_belongs_to_model(x::AbstractJuMPScalar, model::AbstractModel)
check_belongs_to_model(x::AbstractConstraint, model::AbstractModel)

Throw VariableNotOwned if the owner_model of x is not model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> check_belongs_to_model(x, model)

julia> model_2 = Model();

julia> check_belongs_to_model(x, model_2)
ERROR: VariableNotOwned{VariableRef}(x): the variable x cannot be used in this model because
it belongs to a different model.
[...]
check_belongs_to_model(con_ref::ConstraintRef, model::AbstractModel)

Throw ConstraintNotOwned if owner_model(con_ref) is not model.

coefficient(v1::GenericVariableRef{T}, v2::GenericVariableRef{T}) where {T}

Return one(T) if v1 == v2 and zero(T) otherwise.

This is a fallback for other coefficient methods to simplify code in which the expression may be a single variable.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> coefficient(x[1], x[1])
1.0

julia> coefficient(x[1], x[2])
0.0
coefficient(a::GenericAffExpr{C,V}, v::V) where {C,V}

Return the coefficient associated with variable v in the affine expression a.

Example

julia> model = Model();

julia> @variable(model, x);

julia> expr = 2.0 * x + 1.0;

julia> coefficient(expr, x)
2.0
coefficient(a::GenericQuadExpr{C,V}, v1::V, v2::V) where {C,V}

Return the coefficient associated with the term v1 * v2 in the quadratic expression a.

Note that coefficient(a, v1, v2) is the same as coefficient(a, v2, v1).

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] * x[2];

julia> coefficient(expr, x[1], x[2])
2.0

julia> coefficient(expr, x[2], x[1])
2.0

julia> coefficient(expr, x[1], x[1])
0.0
coefficient(a::GenericQuadExpr{C,V}, v::V) where {C,V}

Return the coefficient associated with variable v in the affine component of a.

Example

julia> model = Model();

julia> @variable(model, x);

julia> expr = 2.0 * x^2 + 3.0 * x;

julia> coefficient(expr, x)
3.0
compute_conflict!(model::GenericModel)

Compute a conflict if the model is infeasible.

The conflict is also called the Irreducible Infeasible Subsystem (IIS).

If an optimizer has not been set yet (see set_optimizer), a NoOptimizer error is thrown.

The status of the conflict can be checked with the MOI.ConflictStatus model attribute. Then, the status for each constraint can be queried with the MOI.ConstraintConflictStatus attribute.

See also: copy_conflict

Example

julia> using JuMP

julia> import Gurobi

julia> model = Model(Gurobi.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0);

julia> @constraint(model, c1, x >= 2);

julia> @constraint(model, c2, x <= 1);

julia> optimize!(model)

julia> compute_conflict!(model)

julia> get_attribute(model, MOI.ConflictStatus())
CONFLICT_FOUND::ConflictStatusCode = 3
constant(aff::GenericAffExpr{C,V})::C

Return the constant of the affine expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> aff = 2.0 * x + 3.0;

julia> constant(aff)
3.0
constant(quad::GenericQuadExpr{C,V})::C

Return the constant of the quadratic expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> quad = 2.0 * x^2 + 3.0;

julia> constant(quad)
3.0
constraint_by_name(model::AbstractModel, name::String, [F, S])::Union{ConstraintRef,Nothing}

Return the reference of the constraint with name attribute name or Nothing if no constraint has this name attribute.

Throws an error if several constraints have name as their name attribute.

If F and S are provided, this method additionally throws an error if the constraint is not an F-in-S constraint, where F is either the JuMP or MOI type of the function and S is the MOI type of the set.

Providing F and S is recommended if you know the type of the function and set, because the return type can then be inferred; for the method above (that is, without F and S), the exact return type cannot be inferred.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, x^2 == 1)
con : x² = 1

julia> constraint_by_name(model, "kon")

julia> constraint_by_name(model, "con")
con : x² = 1

julia> constraint_by_name(model, "con", AffExpr, MOI.EqualTo{Float64})

julia> constraint_by_name(model, "con", QuadExpr, MOI.EqualTo{Float64})
con : x² = 1
constraint_object(con_ref::ConstraintRef)

Return the underlying constraint data for the constraint referenced by con_ref.

Example

A scalar constraint:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))

julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}

julia> object.func
2 x

julia> object.set
MathOptInterface.LessThan{Float64}(1.0)

A vector constraint:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())

julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}

julia> object.func
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> object.set
MathOptInterface.SecondOrderCone(3)
constraint_ref_with_index(model::AbstractModel, index::MOI.ConstraintIndex)

Return a ConstraintRef of model corresponding to index.

This function is a helper function used internally by JuMP and some JuMP extensions. It should not need to be called in user-code.

constraint_string(
    mode::MIME,
    ref::ConstraintRef;
    in_math_mode::Bool = false,
)

Return a string representation of the constraint ref, given the mode.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2 * x <= 1);

julia> constraint_string(MIME("text/plain"), c)
"c : 2 x ≤ 1"
constraints_string(mode, model::AbstractModel)::Vector{String}

Return a list of Strings describing each constraint of the model.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> @constraint(model, c, 2 * x <= 1);

julia> constraints_string(MIME("text/plain"), model)
2-element Vector{String}:
 "c : 2 x ≤ 1"
 "x ≥ 0"
copy_conflict(model::GenericModel)

Return a copy of the current conflict for the model model and a GenericReferenceMap that can be used to obtain the variable and constraint reference of the new model corresponding to a given model's reference.

This is a convenience function that provides a filtering function for copy_model.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, regardless of whether an optimizer was provided at model construction, the new model will have no optimizer; an optimizer must be provided to the new model before calling optimize!.

Example

In the following example, a model model is constructed with a variable x and two constraints c1 and c2. This model has no solution, as the two constraints are mutually exclusive. The solver is asked to compute a conflict with compute_conflict!. The parts of model participating in the conflict are then copied into a model iis_model.

julia> using JuMP

julia> import Gurobi

julia> model = Model(Gurobi.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> @constraint(model, c1, x >= 2)
c1 : x ≥ 2

julia> @constraint(model, c2, x <= 1)
c2 : x ≤ 1

julia> optimize!(model)

julia> compute_conflict!(model)

julia> if get_attribute(model, MOI.ConflictStatus()) == MOI.CONFLICT_FOUND
           iis_model, reference_map = copy_conflict(model)
           print(iis_model)
       end
Feasibility
Subject to
 c1 : x ≥ 2
 c2 : x ≤ 1
copy_extension_data(data, new_model::AbstractModel, model::AbstractModel)

Return a copy of the extension data data of the model model to the extension data of the new model new_model.

A method should be added for any JuMP extension storing data in the ext field.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

Do not engage in type piracy by implementing this method for types of data that you did not define! JuMP extensions should store types that they define in model.ext, rather than regular Julia types.
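
A hedged sketch of the expected pattern (MyExtensionData is a hypothetical type defined by the extension, not by JuMP):

struct MyExtensionData
    num_widgets::Int
end

function JuMP.copy_extension_data(
    data::MyExtensionData,
    new_model::AbstractModel,
    model::AbstractModel,
)
    # Return a fresh copy; the data here is immutable, so a new instance suffices.
    return MyExtensionData(data.num_widgets)
end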

copy_model(model::GenericModel; filter_constraints::Union{Nothing, Function}=nothing)

Return a copy of the model model and a GenericReferenceMap that can be used to obtain the variable and constraint reference of the new model corresponding to a given model's reference. A Base.copy(::AbstractModel) method has also been implemented; it is similar to copy_model but does not return the reference map.

If the filter_constraints argument is given, only the constraints for which this function returns true will be copied. This function is given a constraint reference as argument.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, regardless of whether an optimizer was provided at model construction, the new model will have no optimizer; an optimizer must be provided to the new model before calling optimize!.

Example

In the following example, a model model is constructed with a variable x and a constraint cref. It is then copied into a model new_model with the new references assigned to x_new and cref_new.

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, cref, x == 2)
cref : x = 2

julia> new_model, reference_map = copy_model(model);

julia> x_new = reference_map[x]
x

julia> cref_new = reference_map[cref]
cref : x = 2
delete(model::GenericModel, con_ref::ConstraintRef)

Delete the constraint associated with con_ref from the model model.

Note that delete does not unregister the name from the model, so adding a new constraint of the same name will throw an error. Use unregister to unregister the name after deletion.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> delete(model, c)

julia> unregister(model, :c)

julia> print(model)
Feasibility
Subject to

julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
delete(model::GenericModel, variable_ref::GenericVariableRef)

Delete the variable associated with variable_ref from the model model.

Note that delete does not unregister the name from the model, so adding a new variable of the same name will throw an error. Use unregister to unregister the name after deletion.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> delete(model, x)

julia> unregister(model, :x)

julia> print(model)
Feasibility
Subject to

julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
delete(model::GenericModel, con_refs::Vector{<:ConstraintRef})

Delete the constraints associated with con_refs from the model model.

Solvers may implement specialized methods for deleting multiple constraints of the same concrete type. These methods may be more efficient than repeatedly calling the single constraint delete method.

See also: unregister

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, 2 * x .<= 1)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 c : 2 x[1] ≤ 1
 c : 2 x[2] ≤ 1
 c : 2 x[3] ≤ 1

julia> delete(model, c)

julia> unregister(model, :c)

julia> print(model)
Feasibility
Subject to

julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
delete(model::GenericModel, variable_refs::Vector{<:GenericVariableRef})

Delete the variables associated with variable_refs from the model model. Solvers may implement methods for deleting multiple variables that are more efficient than repeatedly calling the single variable delete method.

See also: unregister

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> delete(model, x)

julia> unregister(model, :x)

julia> print(model)
Feasibility
Subject to

julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
delete_lower_bound(v::GenericVariableRef)

Delete the lower bound constraint of a variable.

Example

julia> model = Model();

julia> @variable(model, x >= 1.0);

julia> has_lower_bound(x)
true

julia> delete_lower_bound(x)

julia> has_lower_bound(x)
false
delete_upper_bound(v::GenericVariableRef)

Delete the upper bound constraint of a variable.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x <= 1.0);

julia> has_upper_bound(x)
true

julia> delete_upper_bound(x)

julia> has_upper_bound(x)
false
direct_generic_model(
    value_type::Type{T},
    backend::MOI.ModelLike;
) where {T<:Real}

Return a new JuMP model using backend to store the model and solve it.

As opposed to the Model constructor, no cache of the model is stored outside of backend and no bridges are automatically applied to backend.

Notes

The absence of a cache reduces the memory footprint, but it is important to bear in mind the following implications of creating models using this direct mode:

  • When backend does not support an operation, such as modifying constraints or adding variables/constraints after solving, an error is thrown. For models created using the Model constructor, such situations can be dealt with by storing the modifications in a cache and loading them into the optimizer when optimize! is called.

  • No constraint bridging is supported by default.

  • The optimizer used cannot be changed after the model is constructed.

  • The model created cannot be copied.

direct_generic_model(::Type{T}, factory::MOI.OptimizerWithAttributes)

Create a direct_generic_model using factory, a MOI.OptimizerWithAttributes object created by optimizer_with_attributes.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer,
           "presolve" => "off",
           MOI.Silent() => true,
       );

julia> model = direct_generic_model(Float64, optimizer)
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

is equivalent to:

julia> import HiGHS

julia> model = direct_generic_model(Float64, HiGHS.Optimizer())
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
direct_model(backend::MOI.ModelLike)

Return a new JuMP model using backend to store the model and solve it.

As opposed to the Model constructor, no cache of the model is stored outside of backend and no bridges are automatically applied to backend.

Notes

The absence of a cache reduces the memory footprint, but it is important to bear in mind the following implications of creating models using this direct mode:

  • When backend does not support an operation, such as modifying constraints or adding variables/constraints after solving, an error is thrown. For models created using the Model constructor, such situations can be dealt with by storing the modifications in a cache and loading them into the optimizer when optimize! is called.

  • No constraint bridging is supported by default.

  • The optimizer used cannot be changed after the model is constructed.

  • The model created cannot be copied.

direct_model(factory::MOI.OptimizerWithAttributes)

Create a direct_model using factory, a MOI.OptimizerWithAttributes object created by optimizer_with_attributes.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer,
           "presolve" => "off",
           MOI.Silent() => true,
       );

julia> model = direct_model(optimizer)
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

is equivalent to:

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer())
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
drop_zeros!(expr::GenericAffExpr)

Remove terms in the affine expression with 0 coefficients.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[1] + x[2];

julia> add_to_expression!(expr, -1.0, x[1])
0 x[1] + x[2]

julia> drop_zeros!(expr)

julia> expr
x[2]
drop_zeros!(expr::GenericQuadExpr)

Remove terms in the quadratic expression with 0 coefficients.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[1]^2 + x[2]^2;

julia> add_to_expression!(expr, -1.0, x[1], x[1])
0 x[1]² + x[2]²

julia> drop_zeros!(expr)

julia> expr
x[2]²
dual(con_ref::ConstraintRef; result::Int = 1)

Return the dual value of constraint con_ref associated with result index result of the most-recent solution returned by the solver.

Use has_duals to check if a result exists before asking for values.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> dual(c)
-2.0
dual_objective_value(model::GenericModel; result::Int = 1)

Return the value of the objective of the dual problem associated with result index result of the most-recent solution returned by the solver.

Throws MOI.UnsupportedAttribute{MOI.DualObjectiveValue} if the solver does not support this attribute.

This function is equivalent to querying the MOI.DualObjectiveValue attribute.

See also: result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> dual_objective_value(model)
3.0

julia> dual_objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.DualObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
dual_shape(shape::AbstractShape)::AbstractShape

Returns the shape of the dual space of the space of objects of shape shape. By default, the dual_shape of a shape is itself. See the examples section below for an example for which this is not the case.

Example

Consider polynomial constraints for which the dual is moment constraints and moment constraints for which the dual is polynomial constraints. Shapes for polynomials can be defined as follows:

struct Polynomial
    coefficients::Vector{Float64}
    monomials::Vector{Monomial}
end
struct PolynomialShape <: AbstractShape
    monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::PolynomialShape) = Polynomial(x, shape.monomials)

and a shape for moments can be defined as follows:

struct Moments
    coefficients::Vector{Float64}
    monomials::Vector{Monomial}
end
struct MomentsShape <: AbstractShape
    monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::MomentsShape) = Moments(x, shape.monomials)

Then dual_shape allows the definition of the shape of the dual of polynomial and moment constraints:

dual_shape(shape::PolynomialShape) = MomentsShape(shape.monomials)
dual_shape(shape::MomentsShape) = PolynomialShape(shape.monomials)
dual_start_value(con_ref::ConstraintRef)

Return the dual start value (MOI attribute ConstraintDualStart) of the constraint con_ref.

If no dual start value has been set, dual_start_value will return nothing.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_dual_start_value(c, [0.0])

julia> dual_start_value(c)
1-element Vector{Float64}:
 0.0

julia> set_dual_start_value(c, nothing)

julia> dual_start_value(c)
dual_status(model::GenericModel; result::Int = 1)

Return a MOI.ResultStatusCode describing the status of the most recent dual solution of the solver (that is, the MOI.DualStatus attribute) associated with the result index result.

See also: result_count.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> dual_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
error_if_direct_mode(model::GenericModel, func::Symbol)

Errors if model is in direct mode during a call from the function named func.

Used internally within JuMP, or by JuMP extensions who do not want to support models in direct mode.

Example

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> error_if_direct_mode(model, :foo)
ERROR: The `foo` function is not supported in DIRECT mode.
Stacktrace:
[...]
fix(v::GenericVariableRef, value::Number; force::Bool = false)

Fix a variable to a value. Update the fixing constraint if one exists, otherwise create a new one.

If the variable already has variable bounds and force=false, calling fix will throw an error. If force=true, existing variable bounds will be deleted, and the fixing constraint will be added. Note a variable will have no bounds after a call to unfix.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_fixed(x)
false

julia> fix(x, 1.0)

julia> is_fixed(x)
true
julia> model = Model();

julia> @variable(model, 0 <= x <= 1);

julia> is_fixed(x)
false

julia> fix(x, 1.0; force = true)

julia> is_fixed(x)
true
fix_discrete_variables([var_value::Function = value,] model::GenericModel)

Modifies model to convert all binary and integer variables to continuous variables with fixed bounds of var_value(x).

Return

Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.

Notes

  • An error is thrown if semi-continuous or semi-integer constraints are present (support may be added for these in the future).

  • All other constraints are ignored (left in place). This includes discrete constraints like SOS and indicator constraints.

Example

julia> model = Model();

julia> @variable(model, x, Bin, start = 1);

julia> @variable(model, 1 <= y <= 10, Int, start = 2);

julia> @objective(model, Min, x + y);

julia> undo_relax = fix_discrete_variables(start_value, model);

julia> print(model)
Min x + y
Subject to
 x = 1
 y = 2

julia> undo_relax()

julia> print(model)
Min x + y
Subject to
 y ≥ 1
 y ≤ 10
 y integer
 x binary
fix_value(v::GenericVariableRef)

Return the value to which a variable is fixed.

Error if one does not exist.

See also FixRef, is_fixed, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> fix_value(x)
1.0
flatten!(expr::GenericNonlinearExpr)

Flatten a nonlinear expression in-place by lifting nested + and * nodes into a single n-ary operation.

Motivation

Nonlinear expressions created using operator overloading can be deeply nested and unbalanced. For example, prod(x for i in 1:4) creates *(x, *(x, *(x, x))) instead of the more preferable *(x, x, x, x).

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> y = prod(x for i in 1:4)
((x²) * x) * x

julia> flatten!(y)
(x²) * x * x

julia> flatten!(sin(prod(x for i in 1:4)))
sin((x²) * x * x)
function_string(
    mode::MIME,
    func::Union{JuMP.AbstractJuMPScalar,Vector{<:JuMP.AbstractJuMPScalar}},
)

Return a String representing the function func using print mode mode.

Example

julia> model = Model();

julia> @variable(model, x);

julia> function_string(MIME("text/plain"), 2 * x + 1)
"2 x + 1"
get_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute)
get_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute)
get_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute)

Get the value of a solver-specific attribute attr.

This is equivalent to calling MOI.get with the associated MOI model and, for variables and constraints, with the associated MOI.VariableIndex or MOI.ConstraintIndex.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1

julia> get_attribute(model, MOI.Name())
""

julia> get_attribute(x, MOI.VariableName())
"x"

julia> get_attribute(c, MOI.ConstraintName())
"c"
get_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)

Get the value of a solver-specific attribute attr.

This is equivalent to calling MOI.get with the associated MOI model.

If attr is an AbstractString, it is converted to MOI.RawOptimizerAttribute.

Example

julia> import HiGHS

julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => true);

julia> model = Model(opt);

julia> get_attribute(model, "output_flag")
true

julia> get_attribute(model, MOI.RawOptimizerAttribute("output_flag"))
true

julia> get_attribute(opt, "output_flag")
true

julia> get_attribute(opt, MOI.RawOptimizerAttribute("output_flag"))
true
get_optimizer_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)

Return the value associated with the solver-specific attribute attr.

If attr is an AbstractString, this is equivalent to get_optimizer_attribute(model, MOI.RawOptimizerAttribute(name)).

Совместимость

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using get_attribute instead.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> get_optimizer_attribute(model, MOI.Silent())
false
has_duals(model::GenericModel; result::Int = 1)

Return true if the solver has a dual solution in result index result available to query, otherwise return false.

See also dual, shadow_price, and result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> has_duals(model)
false

julia> optimize!(model)

julia> has_duals(model)
true
has_lower_bound(v::GenericVariableRef)

Return true if v has a lower bound. If true, the lower bound can be queried with lower_bound.

Example

julia> model = Model();

julia> @variable(model, x >= 1.0);

julia> has_lower_bound(x)
true
has_start_value(variable::AbstractVariableRef)

Return true if the variable has a start value set, otherwise return false.

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
has_upper_bound(v::GenericVariableRef)

Return true if v has an upper bound. If true, the upper bound can be queried with upper_bound.

Example

julia> model = Model();

julia> @variable(model, x <= 1.0);

julia> has_upper_bound(x)
true
has_values(model::GenericModel; result::Int = 1)

Return true if the solver has a primal solution in result index result available to query, otherwise return false.

See also value and result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> has_values(model)
false

julia> optimize!(model)

julia> has_values(model)
true
in_set_string(mode::MIME, set)

Return a String representing the membership to the set set using print mode mode.

Extensions

JuMP extensions may extend this method for new set types to improve the legibility of their printing.

Example

julia> in_set_string(MIME("text/plain"), MOI.Interval(1.0, 2.0))
"∈ [1, 2]"
in_set_string(mode::MIME, constraint::AbstractConstraint)

Return a String representing the membership to the set of the constraint constraint using print mode mode.

index(cr::ConstraintRef)::MOI.ConstraintIndex

Return the index of the constraint that corresponds to cr in the MOI backend.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, x >= 0);

julia> index(c)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}(1)
index(v::GenericVariableRef)::MOI.VariableIndex

Return the index of the variable that corresponds to v in the MOI backend.

Example

julia> model = Model();

julia> @variable(model, x);

julia> index(x)
MOI.VariableIndex(1)
is_binary(v::GenericVariableRef)

Return true if v is constrained to be binary.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> is_binary(x)
true
is_fixed(v::GenericVariableRef)

Return true if v is a fixed variable. If true, the fixed value can be queried with fix_value.

See also FixRef, fix_value, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_fixed(x)
false

julia> fix(x, 1.0)

julia> is_fixed(x)
true
is_integer(v::GenericVariableRef)

Return true if v is constrained to be integer.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_integer(x)
false

julia> set_integer(x)

julia> is_integer(x)
true
is_parameter(x::GenericVariableRef)::Bool

Return true if x is constrained to be a parameter.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> is_parameter(p)
true

julia> @variable(model, x)
x

julia> is_parameter(x)
false
is_solved_and_feasible(
    model::GenericModel;
    allow_local::Bool = true,
    allow_almost::Bool = false,
    dual::Bool = false,
    result::Int = 1,
)

Return true if the model has a feasible primal solution associated with result index result and the termination_status is OPTIMAL (the solver found a global optimum) or LOCALLY_SOLVED (the solver found a local optimum, which may also be the global optimum, but the solver could not prove so).

If allow_local = false, then this function returns true only if the termination_status is OPTIMAL.

If allow_almost = true, then the termination_status may additionally be ALMOST_OPTIMAL or ALMOST_LOCALLY_SOLVED (if allow_local), and the primal_status and dual_status may additionally be NEARLY_FEASIBLE_POINT.

If dual, additionally check that an optimal dual solution is available.

If this function returns false, use termination_status, result_count, primal_status and dual_status to understand what solutions are available (if any).

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> is_solved_and_feasible(model)
false
is_valid(model::GenericModel, con_ref::ConstraintRef{<:AbstractModel})

Return true if con_ref refers to a valid constraint in model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2 * x <= 1);

julia> is_valid(model, c)
true

julia> model_2 = Model();

julia> is_valid(model_2, c)
false
is_valid(model::GenericModel, variable_ref::GenericVariableRef)

Return true if variable refers to a valid variable in model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_valid(model, x)
true

julia> model_2 = Model();

julia> is_valid(model_2, x)
false
isequal_canonical(
    x::T,
    y::T
) where {T<:AbstractJuMPScalar,AbstractArray{<:AbstractJuMPScalar}}

Return true if x is equal to y after dropping zeros and disregarding the order.

This method is mainly useful for testing, because fallbacks like x == y do not account for valid mathematical comparisons like x[1] + 0 x[2] + 1 == x[1] + 1.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> a = x[1] + 1.0
x[1] + 1

julia> b = x[1] + x[2] + 1.0
x[1] + x[2] + 1

julia> add_to_expression!(b, -1.0, x[2])
x[1] + 0 x[2] + 1

julia> a == b
false

julia> isequal_canonical(a, b)
true
jump_function(model::AbstractModel, x::MOI.AbstractFunction)

Given an MathOptInterface object x, return the JuMP equivalent.

See also: moi_function.

Example

julia> model = Model();

julia> @variable(model, x);

julia> f = 2.0 * index(x) + 1.0
1.0 + 2.0 MOI.VariableIndex(1)

julia> jump_function(model, f)
2 x + 1
jump_function(constraint::AbstractConstraint)

Return the function of the constraint constraint in the function-in-set form as a AbstractJuMPScalar or Vector{AbstractJuMPScalar}.
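
For example, a hedged REPL sketch using constraint_object (documented above) to obtain the constraint data:

julia> model = Model();

julia> @variable(model, x);

julia> con = constraint_object(@constraint(model, 2x <= 1));

julia> jump_function(con)
2 x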

jump_function_type(model::AbstractModel, ::Type{T}) where {T}

Given an MathOptInterface object type T, return the JuMP equivalent.

See also: moi_function_type.

Example

julia> model = Model();

julia> jump_function_type(model, MOI.ScalarAffineFunction{Float64})
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
latex_formulation(model::AbstractModel)

Wrap model in a type so that it can be pretty-printed as text/latex in a notebook like IJulia, or in Documenter.

To render the model, end the cell with latex_formulation(model), or call display(latex_formulation(model)) to force the display of the model from inside a function.
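
A minimal sketch (assuming a display, such as IJulia or Documenter, that can render text/latex output):

model = Model()
@variable(model, x >= 0)
@objective(model, Min, 2x)
display(latex_formulation(model))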

linear_terms(aff::GenericAffExpr{C,V})

Provides an iterator over coefficient-variable tuples (a_i::C, x_i::V) in the linear part of the affine expression.
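
For example, a minimal sketch iterating the terms of an affine expression:

model = Model()
@variable(model, x[1:2])
aff = 2.0 * x[1] + 3.0 * x[2] + 1.0
for (coefficient, variable) in linear_terms(aff)
    println(coefficient, " * ", variable)  # prints "2.0 * x[1]" and "3.0 * x[2]"
end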

linear_terms(quad::GenericQuadExpr{C,V})

Provides an iterator over tuples (coefficient::C, variable::V) in the linear part of the quadratic expression.

list_of_constraint_types(model::GenericModel)::Vector{Tuple{Type,Type}}

Return a list of tuples of the form (F, S) where F is a JuMP function type and S is an MOI set type such that all_constraints(model, F, S) returns a nonempty list.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @constraint(model, 2x <= 1);

julia> list_of_constraint_types(model)
3-element Vector{Tuple{Type, Type}}:
 (AffExpr, MathOptInterface.LessThan{Float64})
 (VariableRef, MathOptInterface.GreaterThan{Float64})
 (VariableRef, MathOptInterface.ZeroOne)

Performance considerations

Iterating over the list of function and set types is a type-unstable operation. Consider using a function barrier. See the Performance tips for extensions section of the documentation for more details.

lower_bound(v::GenericVariableRef)

Return the lower bound of a variable. Error if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x >= 1.0);

julia> lower_bound(x)
1.0
lp_matrix_data(model::GenericModel{T})

Given a JuMP model of a linear program, return an LPMatrixData{T} struct storing data for an equivalent linear program in the form:

min  c' * x + c_offset
s.t. b_lower <= A * x <= b_upper
     x_lower <= x <= x_upper

where elements in x may be continuous, integer, or binary variables.

Fields

The struct returned by lp_matrix_data has the fields:

  • A::SparseArrays.SparseMatrixCSC{T,Int}: the constraint matrix in sparse matrix form.

  • b_lower::Vector{T}: the dense vector of row lower bounds. If missing, the value of typemin(T) is used.

  • b_upper::Vector{T}: the dense vector of row upper bounds. If missing, the value of typemax(T) is used.

  • x_lower::Vector{T}: the dense vector of variable lower bounds. If missing, the value of typemin(T) is used.

  • x_upper::Vector{T}: the dense vector of variable upper bounds. If missing, the value of typemax(T) is used.

  • c::Vector{T}: the dense vector of linear objective coefficients

  • c_offset::T: the constant term in the objective function.

  • sense::MOI.OptimizationSense: the objective sense of the model.

  • integers::Vector{Int}: the sorted list of column indices that are integer variables.

  • binaries::Vector{Int}: the sorted list of column indices that are binary variables.

  • variables::Vector{GenericVariableRef{T}}: a vector of GenericVariableRef, corresponding to order of the columns in the matrix form.

  • affine_constraints::Vector{ConstraintRef}: a vector of ConstraintRef, corresponding to the order of rows in the matrix form.

Limitations

The models supported by lp_matrix_data are intentionally limited to linear programs.

Example

julia> model = Model();

julia> @variable(model, x[1:2] >= 0);

julia> @constraint(model, x[1] + 2 * x[2] <= 1);

julia> @objective(model, Max, x[2]);

julia> data = lp_matrix_data(model);

julia> data.A
1×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
 1.0  2.0

julia> data.b_lower
1-element Vector{Float64}:
 -Inf

julia> data.b_upper
1-element Vector{Float64}:
 1.0

julia> data.x_lower
2-element Vector{Float64}:
 0.0
 0.0

julia> data.x_upper
2-element Vector{Float64}:
 Inf
 Inf

julia> data.c
2-element Vector{Float64}:
 0.0
 1.0

julia> data.c_offset
0.0

julia> data.sense
MAX_SENSE::OptimizationSense = 1
lp_sensitivity_report(model::GenericModel{T}; atol::T = Base.rtoldefault(T))::SensitivityReport{T} where {T}

Given a linear program model with a current optimal basis, return a SensitivityReport object, which maps:

  • Every variable reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the objective coefficient of the corresponding variable can change by, such that the original basis remains optimal.

  • Every constraint reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the right-hand side of the corresponding constraint can change by, such that the basis remains optimal.

Both tuples are relative, rather than absolute. So, given an objective coefficient of 1.0 and a tuple (-0.5, 0.5), the objective coefficient can range between 1.0 - 0.5 and 1.0 + 0.5.

atol is the primal/dual optimality tolerance, and should match the tolerance of the solver used to compute the basis.

Note: interval constraints are NOT supported.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, -1 <= x <= 2)
x

julia> @objective(model, Min, x)
x

julia> optimize!(model)

julia> report = lp_sensitivity_report(model; atol = 1e-7);

julia> dx_lo, dx_hi = report[x]
(-1.0, Inf)

julia> println(
           "The objective coefficient of `x` can decrease by $dx_lo or " *
           "increase by $dx_hi."
       )
The objective coefficient of `x` can decrease by -1.0 or increase by Inf.

julia> dRHS_lo, dRHS_hi = report[LowerBoundRef(x)]
(-Inf, 3.0)

julia> println(
           "The lower bound of `x` can decrease by $dRHS_lo or increase " *
           "by $dRHS_hi."
       )
The lower bound of `x` can decrease by -Inf or increase by 3.0.
map_coefficients(f::Function, a::GenericAffExpr)

Apply f to the coefficients and constant term of a GenericAffExpr a and return a new expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1

julia> map_coefficients(c -> 2 * c, a)
2 x + 2

julia> a
x + 1
map_coefficients(f::Function, a::GenericQuadExpr)

Apply f to the coefficients and constant term of a GenericQuadExpr a and return a new expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = @expression(model, x^2 + x + 1)
x² + x + 1

julia> map_coefficients(c -> 2 * c, a)
2 x² + 2 x + 2

julia> a
x² + x + 1
map_coefficients_inplace!(f::Function, a::GenericAffExpr)

Apply f to the coefficients and constant term of a GenericAffExpr a and update them in-place.

See also: map_coefficients

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1

julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x + 2

julia> a
2 x + 2
map_coefficients_inplace!(f::Function, a::GenericQuadExpr)

Apply f to the coefficients and constant term of a GenericQuadExpr a and update them in-place.

See also: map_coefficients

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = @expression(model, x^2 + x + 1)
x² + x + 1

julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x² + 2 x + 2

julia> a
2 x² + 2 x + 2
mode(model::GenericModel)

Return the ModelMode of model.

Example

julia> model = Model();

julia> mode(model)
AUTOMATIC::ModelMode = 0
model_convert(
    model::AbstractModel,
    rhs::Union{
        AbstractConstraint,
        Number,
        AbstractJuMPScalar,
        MOI.AbstractSet,
    },
)

Convert the coefficients and constants of functions and sets in the rhs to the coefficient type value_type(typeof(model)).

Purpose

Creating and adding a constraint is a two-step process. The first step calls build_constraint, and the result of that is passed to add_constraint.

However, because build_constraint does not take the model as an argument, the coefficients and constants of the function or set might be of a different type than value_type(typeof(model)).

Therefore, the result of build_constraint is converted in a call to model_convert before the result is passed to add_constraint.
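
A hedged sketch of the effect (assuming a Model with the default Float64 value type; the expected results are stated as comments, not verified output):

model = Model()
model_convert(model, 1)                # expected: 1.0
model_convert(model, MOI.LessThan(1))  # expected: MOI.LessThan{Float64}(1.0)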

model_string(mode::MIME, model::AbstractModel)

Return a String representation of model given the mode.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> print(model_string(MIME("text/plain"), model))
Feasibility
Subject to
 x ≥ 0
moi_function(x::AbstractJuMPScalar)
moi_function(x::AbstractArray{<:AbstractJuMPScalar})

Given a JuMP object x, return the MathOptInterface equivalent.

See also: jump_function.

Example

julia> model = Model();

julia> @variable(model, x);

julia> f = 2.0 * x + 1.0
2 x + 1

julia> moi_function(f)
1.0 + 2.0 MOI.VariableIndex(1)
moi_function(constraint::AbstractConstraint)

Return the function of the constraint constraint in the function-in-set form as a MathOptInterface.AbstractFunction.
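
For example, a hedged REPL sketch mirroring the jump_function example above:

julia> model = Model();

julia> @variable(model, x);

julia> con = constraint_object(@constraint(model, 2x <= 1));

julia> moi_function(con)
0.0 + 2.0 MOI.VariableIndex(1)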

moi_function_type(::Type{T}) where {T}

Given a JuMP object type T, return the MathOptInterface equivalent.

See also: jump_function_type.

Example

julia> moi_function_type(AffExpr)
MathOptInterface.ScalarAffineFunction{Float64}
moi_set(constraint::AbstractConstraint)

Return the set of the constraint constraint in the function-in-set form as a MathOptInterface.AbstractSet.

moi_set(s::AbstractVectorSet, dim::Int)

Returns the MOI set of dimension dim corresponding to the JuMP set s.

moi_set(s::AbstractScalarSet)

Returns the MOI set corresponding to the JuMP set s.
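
For example, a hedged sketch using the JuMP SecondOrderCone vector set:

julia> moi_set(SecondOrderCone(), 3)
MathOptInterface.SecondOrderCone(3)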

name(model::AbstractModel)

Return the MOI.Name attribute of model's backend, or a default if empty.

Example

julia> model = Model();

julia> name(model)
"A JuMP Model"
name(v::GenericVariableRef)::String

Get a variable’s name attribute.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> name(x[1])
"x[1]"
name(con_ref::ConstraintRef)

Get a constraint’s name attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> name(c)
"c"
node_count(model::GenericModel)

If available, returns the total number of branch-and-bound nodes explored during the most recent optimization in a Mixed Integer Program (the MOI.NodeCount attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> node_count(model)
0
nonlinear_constraint_string(
    model::GenericModel,
    mode::MIME,
    c::_NonlinearConstraint,
)

Return a string representation of the nonlinear constraint c belonging to model, given the mode.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

nonlinear_dual_start_value(model::Model)

Return the current value of the MOI attribute MOI.NLPBlockDualStart.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

nonlinear_expr_string(
    model::GenericModel,
    mode::MIME,
    c::MOI.Nonlinear.Expression,
)

Return a string representation of the nonlinear expression c belonging to model, given the mode.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

nonlinear_model(
    model::GenericModel;
    force::Bool = false,
)::Union{MOI.Nonlinear.Model,Nothing}

If model has nonlinear components, return a MOI.Nonlinear.Model, otherwise return nothing.

If force, always return a MOI.Nonlinear.Model, and if one does not exist for the model, create an empty one.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.
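
Example

A minimal sketch using the legacy @NLconstraint macro; an isa check is shown instead of the full printed output:

julia> model = Model();

julia> nonlinear_model(model) === nothing
true

julia> @variable(model, x);

julia> @NLconstraint(model, x^2 <= 1);

julia> nonlinear_model(model) isa MOI.Nonlinear.Model
true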

normalized_coefficient(
    constraint::ConstraintRef,
    variable_1::GenericVariableRef,
    variable_2::GenericVariableRef,
)

Return the quadratic coefficient associated with variable_1 and variable_2 in constraint after JuMP has normalized the constraint into its standard form.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> normalized_coefficient(con, x[1], x[1])
2.0

julia> normalized_coefficient(con, x[1], x[2])
3.0

julia> @constraint(model, con_vec, x.^2 <= [1, 2])
con_vec : [x[1]² - 1, x[2]² - 2] ∈ Nonpositives()

julia> normalized_coefficient(con_vec, x[1], x[1])
1-element Vector{Tuple{Int64, Float64}}:
 (1, 1.0)

julia> normalized_coefficient(con_vec, x[1], x[2])
Tuple{Int64, Float64}[]
normalized_coefficient(
    constraint::ConstraintRef,
    variable::GenericVariableRef,
)

Return the coefficient associated with variable in constraint after JuMP has normalized the constraint into its standard form.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2

julia> normalized_coefficient(con, x)
5.0

julia> @constraint(model, con_vec, [x, 2x + 1, 3] >= 0)
con_vec : [x, 2 x + 1, 3] ∈ Nonnegatives()

julia> normalized_coefficient(con_vec, x)
2-element Vector{Tuple{Int64, Float64}}:
 (1, 1.0)
 (2, 2.0)
normalized_rhs(constraint::ConstraintRef)

Return the right-hand side term of constraint after JuMP has converted the constraint into its normalized form.

See also set_normalized_rhs.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1

julia> normalized_rhs(con)
1.0
num_constraints(model::GenericModel, function_type, set_type)::Int64

Return the number of constraints currently in the model where the function has type function_type and the set has type set_type.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @variable(model, y);

julia> @constraint(model, y in MOI.GreaterThan(1.0));

julia> @constraint(model, y <= 1.0);

julia> @constraint(model, 2x <= 1);

julia> num_constraints(model, VariableRef, MOI.GreaterThan{Float64})
2

julia> num_constraints(model, VariableRef, MOI.ZeroOne)
1

julia> num_constraints(model, AffExpr, MOI.LessThan{Float64})
2
num_constraints(model::GenericModel; count_variable_in_set_constraints::Bool)

Return the number of constraints in model.

If count_variable_in_set_constraints == true, then VariableRef constraints such as VariableRef-in-Integer are included. To count only the number of structural constraints (for example, the rows in the constraint matrix of a linear program), pass count_variable_in_set_constraints = false.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Int);

julia> @constraint(model, 2x <= 1);

julia> num_constraints(model; count_variable_in_set_constraints = true)
3

julia> num_constraints(model; count_variable_in_set_constraints = false)
1
num_nonlinear_constraints(model::GenericModel)

Returns the number of nonlinear constraints associated with the model.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

This function counts only the constraints added with @NLconstraint and add_nonlinear_constraint. It does not count GenericNonlinearExpr constraints.
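
Example

A minimal sketch using the legacy @NLconstraint macro:

julia> model = Model();

julia> @variable(model, x);

julia> @NLconstraint(model, x^2 <= 1);

julia> num_nonlinear_constraints(model)
1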

num_variables(model::GenericModel)::Int64

Returns the number of variables in model.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> num_variables(model)
2
object_dictionary(model::GenericModel)

Return the dictionary that maps the symbol name of a variable, constraint, or expression to the corresponding object.

Objects are registered to a specific symbol in the macros. For example, @variable(model, x[1:2, 1:2]) registers the array of variables x to the symbol :x.

This method should be defined for any subtype of AbstractModel.

See also: unregister.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> object_dictionary(model)
Dict{Symbol, Any} with 1 entry:
  :x => VariableRef[x[1], x[2]]
objective_bound(model::GenericModel)

Return the best known bound on the optimal objective value after a call to optimize!(model).

For scalar-valued objectives, this function returns a Float64. For vector-valued objectives, it returns a Vector{Float64}.

In the case of a vector-valued objective, this returns the ideal point, that is, the point obtained if each objective was optimized independently.

This function is equivalent to querying the MOI.ObjectiveBound attribute.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1, Int);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> objective_bound(model)
3.0
objective_function(
    model::GenericModel,
    ::Type{F} = objective_function_type(model),
) where {F}

Return an object of type F representing the objective function.

Errors if the objective is not convertible to type F.

This function is equivalent to querying the MOI.ObjectiveFunction{F} attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Min, 2x + 1)
2 x + 1

julia> objective_function(model, AffExpr)
2 x + 1

julia> objective_function(model, QuadExpr)
2 x + 1

julia> typeof(objective_function(model, QuadExpr))
QuadExpr (alias for GenericQuadExpr{Float64, GenericVariableRef{Float64}})

The last two commands show that even though the objective function is affine, it is convertible to a quadratic function, so it can be queried as one, and the result is returned as a quadratic expression.

However, it is not convertible to a variable:

julia> objective_function(model, VariableRef)
ERROR: InexactError: convert(MathOptInterface.VariableIndex, 1.0 + 2.0 MOI.VariableIndex(1))
[...]
objective_function_string(mode, model::AbstractModel)::String

Return a String describing the objective function of the model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2 * x);

julia> objective_function_string(MIME("text/plain"), model)
"2 x"
objective_function_type(model::GenericModel)::AbstractJuMPScalar

Return the type of the objective function.

This function is equivalent to querying the MOI.ObjectiveFunctionType attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2 * x + 1);

julia> objective_function_type(model)
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
objective_sense(model::GenericModel)::MOI.OptimizationSense

Return the objective sense.

This function is equivalent to querying the MOI.ObjectiveSense attribute.

Example

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2

julia> @variable(model, x);

julia> @objective(model, Max, x)
x

julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
objective_value(model::GenericModel; result::Int = 1)

Return the objective value associated with result index result of the most-recent solution returned by the solver.

For scalar-valued objectives, this function returns a Float64. For vector-valued objectives, it returns a Vector{Float64}.

This function is equivalent to querying the MOI.ObjectiveValue attribute.

See also: result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> objective_value(model)
3.0

julia> objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.ObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
op_ifelse(a, x, y)

A function that falls back to ifelse(a, x, y), but when called with a JuMP variable or expression in the first argument, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_ifelse(true, 1.0, 2.0)
1.0

julia> op_ifelse(x, 1.0, 2.0)
ifelse(x, 1.0, 2.0)

julia> op_ifelse(true, x, 2.0)
x
op_string(mime::MIME, x::GenericNonlinearExpr, ::Val{op}) where {op}

Return the string that should be printed for the operator op when function_string is called with mime and x.

Example

julia> model = Model();

julia> @variable(model, x[1:2], Bin);

julia> f = @expression(model, x[1] || x[2]);

julia> op_string(MIME("text/plain"), f, Val(:||))
"||"
operator_to_set(error_fn::Function, ::Val{sense_symbol})

Converts a sense symbol to a set set such that @constraint(model, func sense_symbol 0) is equivalent to @constraint(model, func in set) for any func::AbstractJuMPScalar.

Example

Once a custom set is defined you can directly create a JuMP constraint with it:

julia> struct CustomSet{T} <: MOI.AbstractScalarSet
           value::T
       end

julia> Base.copy(x::CustomSet) = CustomSet(x.value)

julia> model = Model();

julia> @variable(model, x)
x

julia> cref = @constraint(model, x in CustomSet(1.0))
x ∈ CustomSet{Float64}(1.0)

However, there might be an appropriate sign that could be used in order to provide a more convenient syntax:

julia> JuMP.operator_to_set(::Function, ::Val{:⊰}) = CustomSet(0.0)

julia> MOIU.supports_shift_constant(::Type{<:CustomSet}) = true

julia> MOIU.shift_constant(set::CustomSet, value) = CustomSet(set.value + value)

julia> cref = @constraint(model, x ⊰ 1)
x ∈ CustomSet{Float64}(1.0)

Note that the whole function is first moved to the right-hand side, then the sign is transformed into a set with zero constant and finally the constant is moved to the set with MOIU.shift_constant.

operator_warn(model::AbstractModel)
operator_warn(model::GenericModel)

This function is called on the model whenever two affine expressions are added together without using destructive_add!, and at least one of the two expressions has more than 50 terms.

For the case of Model, if this function is called more than 20,000 times then a warning is generated once.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

optimize!(
    model::GenericModel;
    ignore_optimize_hook = (model.optimize_hook === nothing),
    kwargs...,
)

Optimize the model.

If an optimizer has not been set yet (see set_optimizer), a NoOptimizer error is thrown.

If ignore_optimize_hook == true, the optimize hook is ignored and the model is solved as if the hook was not set. Keyword arguments kwargs are passed to the optimize_hook. An error is thrown if optimize_hook is nothing and keyword arguments are provided.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> function my_optimize_hook(model; foo)
           println("Hook called with foo = ", foo)
           return optimize!(model; ignore_optimize_hook = true)
       end
my_optimize_hook (generic function with 1 method)

julia> set_optimize_hook(model, my_optimize_hook)
my_optimize_hook (generic function with 1 method)

julia> optimize!(model; foo = 2)
Hook called with foo = 2
optimizer_index(x::GenericVariableRef)::MOI.VariableIndex
optimizer_index(x::ConstraintRef{<:GenericModel})::MOI.ConstraintIndex

Return the variable or constraint index that corresponds to x in the associated model unsafe_backend(owner_model(x)).

This function should be used with unsafe_backend.

As a safer alternative, use backend and index. See the docstrings of backend and unsafe_backend for more details.

Throws

  • Throws NoOptimizer if no optimizer is set.

  • Throws an ErrorException if the optimizer is set but is not attached.

  • Throws an ErrorException if the index is bridged.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> MOI.Utilities.attach_optimizer(model)

julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> optimizer_index(x)
MOI.VariableIndex(1)
optimizer_with_attributes(optimizer_constructor, attrs::Pair...)

Groups an optimizer constructor with the list of attributes attrs. Note that it is equivalent to MOI.OptimizerWithAttributes.

When provided to the Model constructor or to set_optimizer, it creates an optimizer by calling optimizer_constructor(), and then sets the attributes using set_attribute.

Note

The string names of the attributes are specific to each solver. One should consult the solver’s documentation to find the attributes of interest.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer, "presolve" => "off", MOI.Silent() => true,
       );

julia> model = Model(optimizer);

is equivalent to:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
owner_model(s::AbstractJuMPScalar)

Return the model owning the scalar s.

Example

julia> model = Model();

julia> @variable(model, x);

julia> owner_model(x) === model
true
owner_model(v::AbstractVariableRef)

Returns the model to which v belongs.

Example

julia> model = Model();

julia> x = @variable(model)
_[1]

julia> owner_model(x) === model
true
owner_model(con_ref::ConstraintRef)

Returns the model to which con_ref belongs.
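
Example

A minimal sketch; the constraint name c is illustrative:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, x >= 0)
c : x ≥ 0

julia> owner_model(c) === model
true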

parameter_value(x::GenericVariableRef)

Return the value of the parameter x.

Errors if x is not a parameter.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> parameter_value(p)
2.0

julia> set_parameter_value(p, 2.5)

julia> parameter_value(p)
2.5
parse_constraint(error_fn::Function, expr::Expr)

The entry-point for all constraint-related parsing.

Arguments

  • The error_fn function is passed everywhere to provide better error messages

  • expr comes from the @constraint macro. There are two possibilities:

    • @constraint(model, expr)

    • @constraint(model, name[args], expr)

    In both cases, expr is the main component of the constraint.

Supported syntax

JuMP currently supports the following expr objects:

  • lhs <= rhs

  • lhs == rhs

  • lhs >= rhs

  • l <= body <= u

  • u >= body >= l

  • lhs ⟂ rhs

  • lhs in rhs

  • lhs ∈ rhs

  • z --> {constraint}

  • !z --> {constraint}

  • z <--> {constraint}

  • !z <--> {constraint}

  • z => {constraint}

  • !z => {constraint}

as well as all broadcasted variants.

Extensions

The infrastructure behind parse_constraint is extendable. See parse_constraint_head and parse_constraint_call for details.

parse_constraint_call(
    error_fn::Function,
    vectorized::Bool,
    ::Val{op},
    lhs,
    rhs,
) where {op}

Fallback handler for binary operators. These might be infix operators like @constraint(model, lhs op rhs), or normal operators like @constraint(model, op(lhs, rhs)).

In both cases, we rewrite as lhs - rhs in operator_to_set(error_fn, op).

See operator_to_set for details.

parse_constraint_call(
    error_fn::Function,
    is_vectorized::Bool,
    ::Val{op},
    args...,
)

Implement this method to intercept the parsing of a :call expression with operator op.

Extending the constraint macro at parse time is an advanced operation and has the potential to interfere with existing JuMP syntax. Please discuss with the developer chatroom before publishing any code that implements these methods.

Arguments

  • error_fn: a function that accepts a String and throws the string as an error, along with some descriptive information of the macro from which it was thrown.

  • is_vectorized: a boolean to indicate if op should be broadcast or not

  • op: the first element of the .args field of the Expr to intercept

  • args...: the .args field of the Expr.

Returns

This function must return:

  • parse_code::Expr: an expression containing any setup or rewriting code that needs to be called before build_constraint

  • build_code::Expr: an expression that calls build_constraint( or build_constraint.( depending on is_vectorized.

parse_constraint_head(error_fn::Function, ::Val{head}, args...)

Implement this method to intercept the parsing of an expression with head head.

Extending the constraint macro at parse time is an advanced operation and has the potential to interfere with existing JuMP syntax. Please discuss with the developer chatroom before publishing any code that implements these methods.

Arguments

  • error_fn: a function that accepts a String and throws the string as an error, along with some descriptive information of the macro from which it was thrown.

  • head: the .head field of the Expr to intercept

  • args...: the .args field of the Expr.

Returns

This function must return:

  • is_vectorized::Bool: whether the expression represents a broadcasted expression like x .<= 1

  • parse_code::Expr: an expression containing any setup or rewriting code that needs to be called before build_constraint

  • build_code::Expr: an expression that calls build_constraint( or build_constraint.( depending on is_vectorized.

Existing implementations

JuMP currently implements:

  • ::Val{:call}, which forwards calls to parse_constraint_call

  • ::Val{:comparison}, which handles the special case of l <= body <= u.

parse_one_operator_variable(
    error_fn::Function,
    info_expr::_VariableInfoExpr,
    sense::Val{S},
    value,
) where {S}

Update info_expr for a variable expression in the @variable macro of the form variable name S value.

parse_ternary_variable(error_fn, info_expr, lhs_sense, lhs, rhs_sense, rhs)

A hook for JuMP extensions to intercept the parsing of a :comparison expression, which has the form lhs lhs_sense variable rhs_sense rhs.

parse_variable(error_fn::Function, ::_VariableInfoExpr, args...)

A hook for extensions to intercept the parsing of inequality constraints in the @variable macro.

primal_feasibility_report(
    model::GenericModel{T},
    point::AbstractDict{GenericVariableRef{T},T} = _last_primal_solution(model),
    atol::T = zero(T),
    skip_missing::Bool = false,
)::Dict{Any,T}

Given a dictionary point, which maps variables to primal values, return a dictionary whose keys are the constraints with an infeasibility greater than the supplied tolerance atol. The value corresponding to each key is the respective infeasibility. Infeasibility is defined as the distance between the primal value of the constraint (see MOI.ConstraintPrimal) and the nearest point by Euclidean distance in the corresponding set.

Notes

  • If skip_missing = true, constraints containing variables that are not in point will be ignored.

  • If skip_missing = false and a partial primal solution is provided, an error will be thrown.

  • If no point is provided, the primal solution from the last time the model was solved is used.

Example

julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1);

julia> primal_feasibility_report(model, Dict(x => 0.2))
Dict{Any, Float64} with 1 entry:
  x ≥ 0.5 => 0.3
primal_feasibility_report(
    point::Function,
    model::GenericModel{T};
    atol::T = zero(T),
    skip_missing::Bool = false,
) where {T}

A form of primal_feasibility_report where a function is passed as the first argument instead of a dictionary as the second argument.

Example

julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1, start = 1.3);

julia> primal_feasibility_report(model) do v
           return start_value(v)
       end
Dict{Any, Float64} with 1 entry:
  x ≤ 1 => 0.3
primal_status(model::GenericModel; result::Int = 1)

Return a MOI.ResultStatusCode describing the status of the most recent primal solution of the solver (that is, the MOI.PrimalStatus attribute) associated with the result index result.

See also: result_count.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> primal_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
print_active_bridges(
    [io::IO = stdout,]
    model::GenericModel,
    F::Type,
    S::Type{<:MOI.AbstractSet},
)

Print a list of bridges required for a constraint of type F-in-S.

print_active_bridges(
    [io::IO = stdout,]
    model::GenericModel,
    S::Type{<:MOI.AbstractSet},
)

Print a list of bridges required to add a variable constrained to the set S.

print_active_bridges([io::IO = stdout,] model::GenericModel)

Print a list of the variable, constraint, and objective bridges that are currently used in the model.

print_active_bridges([io::IO = stdout,] model::GenericModel, ::Type{F}) where {F}

Print a list of bridges required for an objective function of type F.

print_bridge_graph([io::IO,] model::GenericModel)

Print the hyper-graph containing all variable, constraint, and objective types that could be obtained by bridging the variables, constraints, and objectives that are present in the model.

This function is intended for advanced users. If you want to see only the bridges that are currently used, use print_active_bridges instead.

Explanation of output

Each node in the hyper-graph corresponds to a variable, constraint, or objective type.

  • Variable nodes are indicated by [ ]

  • Constraint nodes are indicated by ( )

  • Objective nodes are indicated by | |

The number inside each pair of brackets is an index of the node in the hyper-graph.

Note that this hyper-graph is the full list of possible transformations. When the bridged model is created, we select the shortest hyper-path(s) from this graph, so many nodes may be unused.

For more information, see Legat, B., Dowson, O., Garcia, J., and Lubin, M. (2020). "MathOptInterface: a data structure for mathematical optimization problems." URL: https://arxiv.org/abs/2002.03447

quad_terms(quad::GenericQuadExpr{C,V})

Provides an iterator over tuples (coefficient::C, var_1::V, var_2::V) in the quadratic part of the quadratic expression.
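
Example

A minimal sketch; collect is used only to materialize the iterator, and the printed element type may vary between versions:

julia> model = Model();

julia> @variable(model, x);

julia> expr = 2 * x^2 + x + 1
2 x² + x + 1

julia> collect(quad_terms(expr))
1-element Vector{Tuple{Float64, VariableRef, VariableRef}}:
 (2.0, x, x)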

raw_status(model::GenericModel)

Return the reason why the solver stopped in its own words (that is, the MathOptInterface model attribute MOI.RawStatusString).

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> raw_status(model)
"optimize not called"
read_from_file(
    filename::String;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
    kwargs...,
)

Return a JuMP model read from filename in the format format.

If the filename ends in .gz, it will be uncompressed using GZip. If the filename ends in .bz2, it will be uncompressed using BZip2.

Other kwargs are passed to the Model constructor of the chosen format.
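
Example

A minimal sketch, assuming the file model.mps can be written to and read from the current working directory:

julia> model = Model();

julia> @variable(model, x >= 0);

julia> write_to_file(model, "model.mps")

julia> new_model = read_from_file("model.mps");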

reduced_cost(x::GenericVariableRef{T})::T where {T}

Return the reduced cost associated with variable x.

One interpretation of the reduced cost is that it is the change in the objective from an infinitesimal relaxation of the variable bounds.

This method is equivalent to querying the shadow price of the active variable bound (if one exists and is active).

See also: shadow_price.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x <= 1);

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> reduced_cost(x)
2.0
register(
    model::Model,
    s::Symbol,
    dimension::Integer,
    f::Function,
    ∇f::Function,
    ∇²f::Function,
)

Register the user-defined function f that takes dimension arguments in model as the symbol s. In addition, provide a gradient function ∇f and a hessian function ∇²f.

∇f and ∇²f must return numbers corresponding to the first- and second-order derivatives of the function f respectively.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • Because automatic differentiation is not used, you can assume the inputs are all Float64.

  • This method will throw an error if dimension > 1.

  • s does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> register(model, :foo, 1, f, ∇f, ∇²f)

julia> @NLobjective(model, Min, foo(x))
register(
    model::Model,
    s::Symbol,
    dimension::Integer,
    f::Function,
    ∇f::Function;
    autodiff::Bool = false,
)

Register the user-defined function f that takes dimension arguments in model as the symbol s. In addition, provide a gradient function ∇f.

The functions f and ∇f must support all subtypes of Real as arguments. Do not assume that the inputs are Float64.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • If the function f is univariate (that is, dimension == 1), ∇f must return a number which represents the first-order derivative of the function f.

  • If the function f is multi-variate, ∇f must have a signature matching ∇f(g::AbstractVector{T}, args::T...) where {T<:Real}, where the first argument is a vector g that is modified in-place with the gradient.

  • If autodiff = true and dimension == 1, use automatic differentiation to compute the second-order derivative information. If autodiff = false, only first-order derivative information will be used.

  • s does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)

julia> ∇f(x::T) where {T<:Real} = 2 * x
∇f (generic function with 1 method)

julia> register(model, :foo, 1, f, ∇f; autodiff = true)

julia> @NLobjective(model, Min, foo(x))

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)

julia> function ∇g(g::AbstractVector{T}, x::T, y::T) where {T<:Real}
           g[1] = y
           g[2] = x
           return
       end
∇g (generic function with 1 method)

julia> register(model, :g, 2, g, ∇g)

julia> @NLobjective(model, Min, g(x[1], x[2]))
register(
    model::Model,
    op::Symbol,
    dimension::Integer,
    f::Function;
    autodiff::Bool = false,
)

Register the user-defined function f that takes dimension arguments in model as the symbol op.

The function f must support all subtypes of Real as arguments. Do not assume that the inputs are Float64.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • For this method, you must explicitly set autodiff = true, because no user-provided gradient function ∇f is given.

  • Second-derivative information is only computed if dimension == 1.

  • op does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)

julia> register(model, :foo, 1, f; autodiff = true)

julia> @NLobjective(model, Min, foo(x))

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)

julia> register(model, :g, 2, g; autodiff = true)

julia> @NLobjective(model, Min, g(x[1], x[2]))
relative_gap(model::GenericModel)

Return the final relative optimality gap after a call to optimize!(model).

The exact value depends on how the particular solver used for optimization implements MOI.RelativeGap.

This function is equivalent to querying the MOI.RelativeGap attribute.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1, Int);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> relative_gap(model)
0.0
relax_integrality(model::GenericModel)

Modifies model to "relax" all binary and integrality constraints on variables. Specifically,

  • Binary constraints are deleted, and variable bounds are tightened if necessary to ensure the variable is constrained to the interval [0, 1].

  • Integrality constraints are deleted without modifying variable bounds.

  • An error is thrown if semi-continuous or semi-integer constraints are present (support may be added for these in the future).

  • All other constraints are ignored (left in place). This includes discrete constraints like SOS and indicator constraints.

Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> @variable(model, 1 <= y <= 10, Int);

julia> @objective(model, Min, x + y);

julia> undo_relax = relax_integrality(model);

julia> print(model)
Min x + y
Subject to
 x ≥ 0
 y ≥ 1
 x ≤ 1
 y ≤ 10

julia> undo_relax()

julia> print(model)
Min x + y
Subject to
 y ≥ 1
 y ≤ 10
 y integer
 x binary
relax_with_penalty!(
    model::GenericModel{T},
    [penalties::Dict{ConstraintRef,T}];
    [default::Union{Nothing,Real} = nothing,]
) where {T}

Destructively modify the model in-place to create a penalized relaxation of the constraints.

This is a destructive routine that modifies the model in-place. If you don’t want to modify the original model, use copy_model to create a copy before calling relax_with_penalty!.

Reformulation

See MOI.Utilities.ScalarPenaltyRelaxation for details of the reformulation.

For each constraint ci, the penalty passed to MOI.Utilities.ScalarPenaltyRelaxation is get(penalties, ci, default). If the value is nothing, because ci does not exist in penalties and default = nothing, then the constraint is skipped.

Return value

This function returns a Dict{ConstraintRef,AffExpr} that maps each constraint index to the corresponding y + z as an AffExpr. In an optimal solution, query the value of these functions to compute the violation of each constraint.

Relax a subset of constraints

To relax a subset of constraints, pass a penalties dictionary and set default = nothing.

Example

julia> function new_model()
           model = Model()
           @variable(model, x)
           @objective(model, Max, 2x + 1)
           @constraint(model, c1, 2x - 1 <= -2)
           @constraint(model, c2, 3x >= 0)
           return model
       end
new_model (generic function with 1 method)

julia> model_1 = new_model();

julia> penalty_map = relax_with_penalty!(model_1; default = 2.0);

julia> penalty_map[model_1[:c1]]
_[3]

julia> penalty_map[model_1[:c2]]
_[2]

julia> print(model_1)
Max 2 x - 2 _[2] - 2 _[3] + 1
Subject to
 c2 : 3 x + _[2] ≥ 0
 c1 : 2 x - _[3] ≤ -1
 _[2] ≥ 0
 _[3] ≥ 0

julia> model_2 = new_model();

julia> relax_with_penalty!(model_2, Dict(model_2[:c2] => 3.0))
Dict{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}, AffExpr} with 1 entry:
  c2 : 3 x + _[2] ≥ 0 => _[2]

julia> print(model_2)
Max 2 x - 3 _[2] + 1
Subject to
 c2 : 3 x + _[2] ≥ 0
 c1 : 2 x ≤ -1
 _[2] ≥ 0
remove_bridge(
    model::GenericModel{S},
    BT::Type{<:MOI.Bridges.AbstractBridge};
    coefficient_type::Type{T} = S,
) where {S,T}

Remove BT{T} from the list of bridges that can be used to transform unsupported constraints into an equivalent formulation using only constraints supported by the optimizer.

See also: add_bridge.

Example

julia> model = Model();

julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> remove_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> add_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64},
       )

julia> remove_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64},
       )
reshape_set(vectorized_set::MOI.AbstractSet, shape::AbstractShape)

Return a set in its original shape shape given its vectorized form vectorized_set.

Example

Given a SymmetricMatrixShape of vectorized form [1, 2, 3] in MOI.PositiveSemidefiniteConeTriangle(2), the following code returns the set of the original constraint Symmetric(Matrix[1 2; 2 3]) in PSDCone():

julia> reshape_set(MOI.PositiveSemidefiniteConeTriangle(2), SymmetricMatrixShape(2))
PSDCone()
reshape_vector(vectorized_form::Vector, shape::AbstractShape)

Return an object in its original shape shape given its vectorized form vectorized_form.

Example

Given a SymmetricMatrixShape of vectorized form [1, 2, 3], the following code returns the matrix Symmetric(Matrix[1 2; 2 3]):

julia> reshape_vector([1, 2, 3], SymmetricMatrixShape(2))
2×2 LinearAlgebra.Symmetric{Int64, Matrix{Int64}}:
 1  2
 2  3
result_count(model::GenericModel)

Return the number of results available to query after a call to optimize!.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> result_count(model)
0
reverse_sense(::Val{T}) where {T}

Given an (in)equality symbol T, return a new Val object with the opposite (in)equality symbol.

This function is intended for use in JuMP extensions.

Example

julia> reverse_sense(Val(:>=))
Val{:<=}()
set_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute, value)
set_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute, value)
set_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute, value)

Set the value of a solver-specific attribute attr to value.

This is equivalent to calling MOI.set with the associated MOI model and, for variables and constraints, with the associated MOI.VariableIndex or MOI.ConstraintIndex.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1

julia> set_attribute(model, MOI.Name(), "model_new")

julia> set_attribute(x, MOI.VariableName(), "x_new")

julia> set_attribute(c, MOI.ConstraintName(), "c_new")
set_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
    value,
)

Set the value of a solver-specific attribute attr to value.

This is equivalent to calling MOI.set with the associated MOI model.

If attr is an AbstractString, it is converted to MOI.RawOptimizerAttribute.

Example

julia> import HiGHS

julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => false);

julia> model = Model(opt);

julia> set_attribute(model, "output_flag", false)

julia> set_attribute(model, MOI.RawOptimizerAttribute("output_flag"), true)

julia> set_attribute(opt, "output_flag", true)

julia> set_attribute(opt, MOI.RawOptimizerAttribute("output_flag"), false)
set_attributes(
    destination::Union{
        GenericModel,
        MOI.OptimizerWithAttributes,
        GenericVariableRef,
        ConstraintRef,
    },
    pairs::Pair...,
)

Given a list of attribute => value pairs, calls set_attribute(destination, attribute, value) for each pair.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_attributes(model, "tol" => 1e-4, "max_iter" => 100)

is equivalent to:

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_attribute(model, "tol", 1e-4)

julia> set_attribute(model, "max_iter", 100)
set_binary(v::GenericVariableRef)

Add a constraint on the variable v that it must take values in the set {0, 1}.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_binary(x)
false

julia> set_binary(x)

julia> is_binary(x)
true
set_dual_start_value(con_ref::ConstraintRef, value)

Set the dual start value (MOI attribute ConstraintDualStart) of the constraint con_ref to value.

To remove a dual start value set it to nothing.

See also dual_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_dual_start_value(c, [0.0])

julia> dual_start_value(c)
1-element Vector{Float64}:
 0.0

julia> set_dual_start_value(c, nothing)

julia> dual_start_value(c)
set_integer(variable_ref::GenericVariableRef)

Add an integrality constraint on the variable variable_ref.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_integer(x)
false

julia> set_integer(x)

julia> is_integer(x)
true
set_lower_bound(v::GenericVariableRef, lower::Number)

Set the lower bound of a variable. If one does not exist, create a new lower bound constraint.

Example

julia> model = Model();

julia> @variable(model, x >= 1.0);

julia> lower_bound(x)
1.0

julia> set_lower_bound(x, 2.0)

julia> lower_bound(x)
2.0
set_name(con_ref::ConstraintRef, s::AbstractString)

Set a constraint’s name attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_name(c, "my_constraint")

julia> name(c)
"my_constraint"

julia> c
my_constraint : [2 x] ∈ Nonnegatives()
set_name(v::GenericVariableRef, s::AbstractString)

Set a variable’s name attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> set_name(x, "x_foo")

julia> x
x_foo

julia> name(x)
"x_foo"
set_nonlinear_dual_start_value(
    model::Model,
    start::Union{Nothing,Vector{Float64}},
)

Set the value of the MOI attribute MOI.NLPBlockDualStart.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

The start vector corresponds to the Lagrangian duals of the nonlinear constraints, in the order given by all_nonlinear_constraints. That is, you must pass a single start vector corresponding to all of the nonlinear constraints in a single function call; you cannot set the dual start value of nonlinear constraints one-by-one. The example below demonstrates how to use all_nonlinear_constraints to create a mapping between the nonlinear constraint references and the start vector.

Pass nothing to unset a previous start.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> nl1 = @NLconstraint(model, x[1] <= sqrt(x[2]));

julia> nl2 = @NLconstraint(model, x[1] >= exp(x[2]));

julia> start = Dict(nl1 => -1.0, nl2 => 1.0);

julia> start_vector = [start[con] for con in all_nonlinear_constraints(model)]
2-element Vector{Float64}:
 -1.0
  1.0

julia> set_nonlinear_dual_start_value(model, start_vector)

julia> nonlinear_dual_start_value(model)
2-element Vector{Float64}:
 -1.0
  1.0
set_nonlinear_objective(
    model::Model,
    sense::MOI.OptimizationSense,
    expr::Expr,
)

Set the nonlinear objective of model to the expression expr, with the optimization sense sense.

This function is most useful if the expression expr is generated programmatically, and you cannot use @NLobjective.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.

  • You must use MIN_SENSE or MAX_SENSE instead of Min and Max.

Example

julia> model = Model();

julia> @variable(model, x);

julia> set_nonlinear_objective(model, MIN_SENSE, :($(x) + $(x)^2))
set_normalized_coefficient(
    constraints::AbstractVector{<:ConstraintRef},
    variables_1::AbstractVector{<:GenericVariableRef},
    variables_2::AbstractVector{<:GenericVariableRef},
    values::AbstractVector{<:Number},
)

Set multiple quadratic coefficients associated with variables_1 and variables_2 in the constraints constraints to values.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x^2 + 3x^2 <= 2, set_normalized_coefficient([con], [x], [x], [4]) will create the constraint 4x^2 <= 2.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> set_normalized_coefficient([con, con], [x[1], x[1]], [x[1], x[2]], [4, 5])

julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
set_normalized_coefficient(
    constraints::AbstractVector{<:ConstraintRef},
    variables::AbstractVector{<:GenericVariableRef},
    values::AbstractVector{<:Number},
)

Set multiple coefficients of variables in the constraints constraints to values.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x + 3x <= 2, set_normalized_coefficient([con], [x], [4]) will create the constraint 4x <= 2.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, y)
y

julia> @constraint(model, con, 2x + 3x + 4y <= 2)
con : 5 x + 4 y ≤ 2

julia> set_normalized_coefficient([con, con], [x, y], [6, 7])

julia> con
con : 6 x + 7 y ≤ 2
set_normalized_coefficient(
    constraint::ConstraintRef,
    variable_1::GenericVariableRef,
    variable_2::GenericVariableRef,
    value::Number,
)

Set the quadratic coefficient associated with variable_1 and variable_2 in the constraint constraint to value.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x^2 + 3x^2 <= 2, set_normalized_coefficient(con, x, x, 4) will create the constraint 4x^2 <= 2.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> set_normalized_coefficient(con, x[1], x[1], 4)

julia> set_normalized_coefficient(con, x[1], x[2], 5)

julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
set_normalized_coefficient(
    con_ref::ConstraintRef,
    variable::AbstractVariableRef,
    new_coefficients::Vector{Tuple{Int64,T}},
)

Set the coefficients of variable in the constraint con_ref to new_coefficients, where each element in new_coefficients is a tuple which maps the row to a new coefficient.

Note that prior to this step, during constraint creation, JuMP will aggregate multiple terms containing the same variable.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, [2x + 3x, 4x] in MOI.Nonnegatives(2))
con : [5 x, 4 x] ∈ MathOptInterface.Nonnegatives(2)

julia> set_normalized_coefficient(con, x, [(1, 2.0), (2, 5.0)])

julia> con
con : [2 x, 5 x] ∈ MathOptInterface.Nonnegatives(2)
set_normalized_coefficient(
    constraint::ConstraintRef,
    variable::GenericVariableRef,
    value::Number,
)

Set the coefficient of variable in the constraint constraint to value.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x + 3x <= 2, set_normalized_coefficient(con, x, 4) will create the constraint 4x <= 2.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2

julia> set_normalized_coefficient(con, x, 4)

julia> con
con : 4 x ≤ 2
set_normalized_coefficients(
    constraint::ConstraintRef{<:AbstractModel,<:MOI.ConstraintIndex{F}},
    variable::AbstractVariableRef,
    new_coefficients::Vector{Tuple{Int64,T}},
) where {T,F<:Union{MOI.VectorAffineFunction{T},MOI.VectorQuadraticFunction{T}}}

A deprecated method that now redirects to set_normalized_coefficient.

set_normalized_rhs(
    constraints::AbstractVector{<:ConstraintRef},
    values::AbstractVector{<:Number}
)

Set the right-hand side terms of all constraints to values.

Note that prior to this step, JuMP will aggregate all constant terms onto the right-hand side of the constraint. For example, given a constraint 2x + 1 <= 2, set_normalized_rhs([con], [4]) will create the constraint 2x <= 4, not 2x + 1 <= 4.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con1, 2x + 1 <= 2)
con1 : 2 x ≤ 1

julia> @constraint(model, con2, 3x + 2 <= 4)
con2 : 3 x ≤ 2

julia> set_normalized_rhs([con1, con2], [4, 5])

julia> con1
con1 : 2 x ≤ 4

julia> con2
con2 : 3 x ≤ 5
set_normalized_rhs(constraint::ConstraintRef, value::Number)

Set the right-hand side term of constraint to value.

Note that prior to this step, JuMP will aggregate all constant terms onto the right-hand side of the constraint. For example, given a constraint 2x + 1 <= 2, set_normalized_rhs(con, 4) will create the constraint 2x <= 4, not 2x + 1 <= 4.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1

julia> set_normalized_rhs(con, 4)

julia> con
con : 2 x ≤ 4
set_objective(model::AbstractModel, sense::MOI.OptimizationSense, func)

The functional equivalent of the @objective macro.

Sets the objective sense and objective function simultaneously, and is equivalent to calling set_objective_sense and set_objective_function separately.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> set_objective(model, MIN_SENSE, x)
set_objective_coefficient(
    model::GenericModel{T},
    variables_1::AbstractVector{<:GenericVariableRef{T}},
    variables_2::AbstractVector{<:GenericVariableRef{T}},
    coefficients::AbstractVector{<:Real},
) where {T}

Set multiple quadratic objective coefficients associated with variables_1 and variables_2 to coefficients, in a single call.

This function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]

julia> set_objective_coefficient(model, [x[1], x[1]], [x[1], x[2]], [2, 3])

julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
set_objective_coefficient(
    model::GenericModel,
    variables::Vector{<:GenericVariableRef},
    coefficients::Vector{<:Real},
)

Set multiple linear objective coefficients associated with variables to coefficients, in a single call.

This function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @objective(model, Min, 3x + 2y + 1)
3 x + 2 y + 1

julia> set_objective_coefficient(model, [x, y], [5, 4])

julia> objective_function(model)
5 x + 4 y + 1
set_objective_coefficient(
    model::GenericModel{T},
    variable_1::GenericVariableRef{T},
    variable_2::GenericVariableRef{T},
    coefficient::Real,
) where {T}

Set the quadratic objective coefficient associated with variable_1 and variable_2 to coefficient.

This function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]

julia> set_objective_coefficient(model, x[1], x[1], 2)

julia> set_objective_coefficient(model, x[1], x[2], 3)

julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
set_objective_coefficient(
    model::GenericModel,
    variable::GenericVariableRef,
    coefficient::Real,
)

Set the linear objective coefficient associated with variable to coefficient.

This function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2x + 1)
2 x + 1

julia> set_objective_coefficient(model, x, 3)

julia> objective_function(model)
3 x + 1
set_objective_function(model::GenericModel, func::MOI.AbstractFunction)
set_objective_function(model::GenericModel, func::AbstractJuMPScalar)
set_objective_function(model::GenericModel, func::Real)
set_objective_function(model::GenericModel, func::Vector{<:AbstractJuMPScalar})

Sets the objective function of the model to the given function.

See set_objective_sense to set the objective sense.

These are low-level functions; the recommended way to set the objective is with the @objective macro.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, x);

julia> objective_function(model)
x

julia> set_objective_function(model, 2 * x + 1)

julia> objective_function(model)
2 x + 1
set_objective_sense(model::GenericModel, sense::MOI.OptimizationSense)

Sets the objective sense of the model to the given sense.

See set_objective_function to set the objective function.

These are low-level functions; the recommended way to set the objective is with the @objective macro.

Example

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2

julia> set_objective_sense(model, MOI.MAX_SENSE)

julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
set_optimize_hook(model::GenericModel, f::Union{Function,Nothing})

Set the function f as the optimize hook for model.

f should have a signature f(model::GenericModel; kwargs...), where the kwargs are those passed to optimize!.

Notes

  • The optimize hook should generally modify the model, or some external state in some way, and then call optimize!(model; ignore_optimize_hook = true) to optimize the problem, bypassing the hook.

  • Use set_optimize_hook(model, nothing) to unset an optimize hook.

Example

julia> model = Model();

julia> function my_hook(model::Model; kwargs...)
           println(kwargs)
           println("Calling with `ignore_optimize_hook = true`")
           optimize!(model; ignore_optimize_hook = true)
           return
       end
my_hook (generic function with 1 method)

julia> set_optimize_hook(model, my_hook)
my_hook (generic function with 1 method)

julia> optimize!(model; test_arg = true)
Base.Pairs{Symbol, Bool, Tuple{Symbol}, @NamedTuple{test_arg::Bool}}(:test_arg => 1)
Calling with `ignore_optimize_hook = true`
ERROR: NoOptimizer()
[...]
set_optimizer(
    model::GenericModel,
    optimizer_factory;
    add_bridges::Bool = true,
)

Creates an empty MathOptInterface.AbstractOptimizer instance by calling optimizer_factory() and sets it as the optimizer of model. Specifically, optimizer_factory must be callable with zero arguments and return an empty MathOptInterface.AbstractOptimizer.

If add_bridges is true, constraints and objectives that are not supported by the optimizer are automatically bridged to an equivalent supported formulation. Passing add_bridges = false can improve performance if the solver natively supports all of the elements in model.

See set_attribute for setting solver-specific parameters of the optimizer.

Example

julia> import HiGHS

julia> model = Model();

julia> set_optimizer(model, () -> HiGHS.Optimizer())

julia> set_optimizer(model, HiGHS.Optimizer; add_bridges = false)
set_optimizer_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
    value,
)

Set the solver-specific attribute attr in model to value.

If attr is an AbstractString, this is equivalent to set_optimizer_attribute(model, MOI.RawOptimizerAttribute(name), value).

Compatibility

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using set_attribute instead.

Example

julia> model = Model();

julia> set_optimizer_attribute(model, MOI.Silent(), true)
set_optimizer_attributes(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    pairs::Pair...,
)

Given a list of attribute => value pairs, calls set_optimizer_attribute(model, attribute, value) for each pair.

Compatibility

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using set_attributes instead.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_optimizer_attributes(model, "tol" => 1e-4, "max_iter" => 100)

is equivalent to:

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_optimizer_attribute(model, "tol", 1e-4)

julia> set_optimizer_attribute(model, "max_iter", 100)
set_parameter_value(x::GenericVariableRef, value)

Update the parameter constraint on the variable x to value.

Errors if x is not a parameter.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> parameter_value(p)
2.0

julia> set_parameter_value(p, 2.5)

julia> parameter_value(p)
2.5
set_silent(model::GenericModel)

Takes precedence over any other attribute controlling verbosity and requires the solver to produce no output.

See also: unset_silent.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_silent(model)

julia> get_attribute(model, MOI.Silent())
true

julia> unset_silent(model)

julia> get_attribute(model, MOI.Silent())
false
set_start_value(con_ref::ConstraintRef, value)

Set the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref to value.

To remove a primal start value set it to nothing.

See also start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_start_value(c, [4.0])

julia> start_value(c)
1-element Vector{Float64}:
 4.0

julia> set_start_value(c, nothing)

julia> start_value(c)
set_start_value(variable::GenericVariableRef, value::Union{Real,Nothing})

Set the start value (MOI.VariablePrimalStart) of the variable to value.

Pass nothing to unset the start value.

VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(x, nothing)

julia> has_start_value(x)
false

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
set_start_values(
    model::GenericModel;
    variable_primal_start::Union{Nothing,Function} = value,
    constraint_primal_start::Union{Nothing,Function} = value,
    constraint_dual_start::Union{Nothing,Function} = dual,
    nonlinear_dual_start::Union{Nothing,Function} = nonlinear_dual_start_value,
)

Set the primal and dual starting values in model using the functions provided.

If any keyword argument is nothing, the corresponding start value is skipped.

If the optimizer does not support setting the starting value, the value will be skipped.

variable_primal_start

This function controls the primal starting solution for the variables. It is equivalent to calling set_start_value for each variable, or setting the MOI.VariablePrimalStart attribute.

If it is a function, it must have the form variable_primal_start(x::VariableRef) that maps each variable x to the starting primal value.

The default is value.

constraint_primal_start

This function controls the primal starting solution for the constraints. It is equivalent to calling set_start_value for each constraint, or setting the MOI.ConstraintPrimalStart attribute.

If it is a function, it must have the form constraint_primal_start(ci::ConstraintRef) that maps each constraint ci to the starting primal value.

The default is value.

constraint_dual_start

This function controls the dual starting solution for the constraints. It is equivalent to calling set_dual_start_value for each constraint, or setting the MOI.ConstraintDualStart attribute.

If it is a function, it must have the form constraint_dual_start(ci::ConstraintRef) that maps each constraint ci to the starting dual value.

The default is dual.

nonlinear_dual_start

This function controls the dual starting solution for the nonlinear constraints. It is equivalent to calling set_nonlinear_dual_start_value.

If it is a function, it must have the form nonlinear_dual_start(model::GenericModel) that returns a vector corresponding to the dual start of the constraints.

The default is nonlinear_dual_start_value.
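
Example

A common pattern (a minimal sketch, assuming the HiGHS solver is installed) is to re-use the most-recent solution as the starting point for the next solve:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, x);

julia> optimize!(model)

julia> set_start_values(model)

julia> start_value(x)
1.0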

set_string_names_on_creation(model::GenericModel, value::Bool)

Set the default argument of the set_string_name keyword in the @variable and @constraint macros to value.

The set_string_name keyword is used to determine whether to assign String names to all variables and constraints in model.

By default, value is true. However, for larger models calling set_string_names_on_creation(model, false) can improve performance at the cost of reducing the readability of printing and solver log messages.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_string_names_on_creation(model)
true

julia> set_string_names_on_creation(model, false)

julia> set_string_names_on_creation(model)
false
set_time_limit_sec(model::GenericModel, limit::Float64)

Set the time limit (in seconds) of the solver.

Can be unset using unset_time_limit_sec or with limit set to nothing.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
set_upper_bound(v::GenericVariableRef, upper::Number)

Set the upper bound of a variable. If one does not exist, create an upper bound constraint.

Example

julia> model = Model();

julia> @variable(model, x <= 1.0);

julia> upper_bound(x)
1.0

julia> set_upper_bound(x, 2.0)

julia> upper_bound(x)
2.0
set_value(p::NonlinearParameter, v::Number)

Store the value v in the nonlinear parameter p.

Compatibility

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Example

julia> model = Model();

julia> @NLparameter(model, p == 0)
p == 0.0

julia> set_value(p, 5)
5

julia> value(p)
5.0
shadow_price(con_ref::ConstraintRef)

Return the change in the objective from an infinitesimal relaxation of the constraint.

The shadow price is computed from dual and can be queried only when has_duals is true and the objective sense is MIN_SENSE or MAX_SENSE (not FEASIBILITY_SENSE).

See also reduced_cost.

Comparison to dual

The shadow prices differ at most in sign from the dual value depending on the objective sense. The differences are summarized in the table:

            Min   Max
f(x) <= b    +1    -1
f(x) >= b    -1    +1

Notes

  • The function simply translates signs from dual and does not validate the conditions needed to guarantee the sensitivity interpretation of the shadow price. The caller is responsible, for example, for checking whether the solver converged to an optimal primal-dual pair or a proof of infeasibility.

  • The computation is based on the current objective sense of the model. If this has changed since the last solve, the results will be incorrect.

  • Relaxation of equality constraints (and hence the shadow price) is defined based on which sense of the equality constraint is active.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> shadow_price(c)
2.0
shape(c::AbstractConstraint)::AbstractShape

Return the shape of the constraint c.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x[2] <= 1);

julia> shape(constraint_object(c))
ScalarShape()

julia> d = @constraint(model, x in SOS1());

julia> shape(constraint_object(d))
VectorShape()
show_backend_summary(io::IO, model::GenericModel)

Print a summary of the optimizer backing model.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> show_backend_summary(stdout, model)
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
show_constraints_summary(io::IO, model::AbstractModel)

Write to io a summary of the number of constraints.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> show_constraints_summary(stdout, model)
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 1 constraint
show_objective_function_summary(io::IO, model::AbstractModel)

Write to io a summary of the objective function type.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> show_objective_function_summary(stdout, model)
Objective function type: AffExpr
simplex_iterations(model::GenericModel)

If available, returns the cumulative number of simplex iterations during the most-recent optimization (the MOI.SimplexIterations attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> simplex_iterations(model)
0
solution_summary(model::GenericModel; result::Int = 1, verbose::Bool = false)

Return a struct that can be used to print a summary of the solution in result result.

If verbose=true, write out the primal solution for every variable and the dual solution for every constraint, excluding those with empty names.

Example

When called at the REPL, the summary is automatically printed:

julia> model = Model();

julia> solution_summary(model)
* Solver : No optimizer attached.

* Status
  Result count       : 0
  Termination status : OPTIMIZE_NOT_CALLED
  Message from the solver:
  "optimize not called"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : NO_SOLUTION

* Work counters

Use print to force the printing of the summary from inside a function:

julia> model = Model();

julia> function foo(model)
           print(solution_summary(model))
           return
       end
foo (generic function with 1 method)

julia> foo(model)
* Solver : No optimizer attached.

* Status
  Result count       : 0
  Termination status : OPTIMIZE_NOT_CALLED
  Message from the solver:
  "optimize not called"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : NO_SOLUTION

* Work counters
solve_time(model::GenericModel)

If available, returns the solve time in wall-clock seconds reported by the solver (the MOI.SolveTimeSec attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> solve_time(model)
1.0488089174032211e-5
solver_name(model::GenericModel)

If available, returns the MOI.SolverName property of the underlying optimizer.

Returns "No optimizer attached." in AUTOMATIC or MANUAL modes when no optimizer is attached.

Returns "SolverName() attribute not implemented by the optimizer." if the attribute is not implemented.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> solver_name(model)
"Ipopt"

julia> model = Model();

julia> solver_name(model)
"No optimizer attached."

julia> model = Model(MOI.FileFormats.MPS.Model);

julia> solver_name(model)
"SolverName() attribute not implemented by the optimizer."
start_value(con_ref::ConstraintRef)

Return the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref.

If no primal start value has been set, start_value will return nothing.

See also set_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_start_value(c, [4.0])

julia> start_value(c)
1-element Vector{Float64}:
 4.0

julia> set_start_value(c, nothing)

julia> start_value(c)
start_value(v::GenericVariableRef)

Return the start value (MOI.VariablePrimalStart) of the variable v.

VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
termination_status(model::GenericModel)

Return a MOI.TerminationStatusCode describing why the solver stopped (that is, the MOI.TerminationStatus attribute).

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> termination_status(model)
OPTIMIZE_NOT_CALLED::TerminationStatusCode = 0
time_limit_sec(model::GenericModel)

Return the time limit (in seconds) of the model.

Returns nothing if unset.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
triangle_vec(matrix::Matrix)

Return the upper triangle of a matrix concatenated into a vector in the order required by JuMP and MathOptInterface for Triangle sets.

Example

julia> model = Model();

julia> @variable(model, X[1:3, 1:3], Symmetric);

julia> @variable(model, t)
t

julia> @constraint(model, [t; triangle_vec(X)] in MOI.RootDetConeTriangle(3))
[t, X[1,1], X[1,2], X[2,2], X[1,3], X[2,3], X[3,3]] ∈ MathOptInterface.RootDetConeTriangle(3)
unfix(v::GenericVariableRef)

Delete the fixing constraint of a variable.

Error if one does not exist.

See also FixRef, is_fixed, fix_value, fix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> is_fixed(x)
true

julia> unfix(x)

julia> is_fixed(x)
false
unregister(model::GenericModel, key::Symbol)

Unregister the name key from model so that a new variable, constraint, or expression can be created with the same key.

Note that this will not delete the object model[key]; it will just remove the reference at model[key]. To delete the object, use delete as well.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, x)
ERROR: An object of name x is already attached to this model. If this
    is intended, consider using the anonymous construction syntax, for example,
    `x = @variable(model, [1:N], ...)` where the name of the object does
    not appear inside the macro.

    Alternatively, use `unregister(model, :x)` to first unregister
    the existing name from the model. Note that this will not delete the
    object; it will just remove the reference at `model[:x]`.

Stacktrace:
[...]

julia> num_variables(model)
1

julia> unregister(model, :x)

julia> @variable(model, x)
x

julia> num_variables(model)
2
unsafe_backend(model::GenericModel)

Return the innermost optimizer associated with the JuMP model model.

This function should only be used by advanced users looking to access low-level solver-specific functionality. It has a high risk of incorrect usage. We strongly suggest you use the alternative suggested below.

See also: backend.

To obtain the index of a variable or constraint in the unsafe backend, use optimizer_index.

Unsafe behavior

This function is unsafe for two main reasons.

First, the formulation and order of variables and constraints in the unsafe backend may be different to the variables and constraints in model. This can happen because of bridges, or because the solver requires the variables or constraints in a specific order. In addition, the variable or constraint index returned by index at the JuMP level may be different to the index of the corresponding variable or constraint in the unsafe_backend. There is no solution to this. Use the alternative suggested below instead.

Second, the unsafe_backend may be empty, or lack some modifications made to the JuMP model. Thus, before calling unsafe_backend you should first call MOI.Utilities.attach_optimizer to ensure that the backend is synchronized with the JuMP model.

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> MOI.Utilities.attach_optimizer(model)

julia> inner = unsafe_backend(model)
A HiGHS model with 0 columns and 0 rows.

Moreover, if you modify the JuMP model, the reference you have to the backend (that is, inner in the example above) may be out-dated, and you should call MOI.Utilities.attach_optimizer again.

This function is also unsafe in the reverse direction: if you modify the unsafe backend, for example, by adding a new constraint to inner, the changes may be silently discarded by JuMP when the JuMP model is modified or solved.

Alternative

Instead of unsafe_backend, create a model using direct_model and call backend instead.

For example, instead of:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> MOI.Utilities.attach_optimizer(model)

julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> optimizer_index(x)
MOI.VariableIndex(1)

Use:

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> highs = backend(model)  # No need to call `attach_optimizer`.
A HiGHS model with 1 columns and 0 rows.

julia> index(x)
MOI.VariableIndex(1)
unset_binary(variable_ref::GenericVariableRef)

Remove the binary constraint on the variable variable_ref.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> is_binary(x)
true

julia> unset_binary(x)

julia> is_binary(x)
false
unset_integer(variable_ref::GenericVariableRef)

Remove the integrality constraint on the variable variable_ref.

Errors if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x, Int);

julia> is_integer(x)
true

julia> unset_integer(x)

julia> is_integer(x)
false
unset_silent(model::GenericModel)

Neutralize the effect of the set_silent function and let the solver attributes control the verbosity.

See also: set_silent.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_silent(model)

julia> get_attribute(model, MOI.Silent())
true

julia> unset_silent(model)

julia> get_attribute(model, MOI.Silent())
false
unset_time_limit_sec(model::GenericModel)

Unset the time limit of the solver.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
upper_bound(v::GenericVariableRef)

Return the upper bound of a variable.

Error if one does not exist.

Example

julia> model = Model();

julia> @variable(model, x <= 1.0);

julia> upper_bound(x)
1.0
value(con_ref::ConstraintRef; result::Int = 1)

Return the primal value of constraint con_ref associated with result index result of the most-recent solution returned by the solver.

That is, if con_ref is the reference of a constraint func-in-set, it returns the value of func evaluated at the value of the variables (given by value(::GenericVariableRef)).

Use has_values to check if a result exists before asking for values.

See also: result_count.

Note

For scalar constraints, the constant is moved to the set so it is not taken into account in the primal value of the constraint. For instance, the constraint @constraint(model, 2x + 3y + 1 == 5) is transformed into 2x + 3y-in-MOI.EqualTo(4) so the value returned by this function is the evaluation of 2x + 3y.
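
Example

A small sketch of this behavior, using the functional form of value documented below to supply variable values directly (the choice x = y = 1 is illustrative):

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @constraint(model, c, 2x + 3y + 1 == 5)
c : 2 x + 3 y = 4

julia> value(v -> 1.0, c)
5.0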

value(var_value::Function, con_ref::ConstraintRef)

Evaluate the primal value of the constraint con_ref using var_value(v) as the value for each variable v.

value(var_value::Function, v::GenericVariableRef)

Evaluate the value of the variable v as var_value(v).

value(var_value::Function, c::NonlinearConstraintRef)

Evaluate c using var_value(v) as the value for each variable v.

value(var_value::Function, ex::NonlinearExpression)

Evaluate ex using var_value(v) as the value for each variable v.

value(v::GenericAffExpr; result::Int = 1)

Return the value of the GenericAffExpr v associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

value(v::GenericQuadExpr; result::Int = 1)

Return the value of the GenericQuadExpr v associated with result index result of the most-recent solution returned by the solver.

Replaces getvalue for most use cases.

See also: result_count.

value(c::NonlinearConstraintRef; result::Int = 1)

Return the value of the NonlinearConstraintRef c associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

value(ex::NonlinearExpression; result::Int = 1)

Return the value of the NonlinearExpression ex associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

value(p::NonlinearParameter)

Return the current value stored in the nonlinear parameter p.

Example

julia> model = Model();

julia> @NLparameter(model, p == 10)
p == 10.0

julia> value(p)
10.0
value(v::GenericVariableRef; result = 1)

Return the value of variable v associated with result index result of the most-recent solution returned by the solver.

Use has_values to check if a result exists before asking for values.

See also: result_count.
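
Example

For example (assuming the HiGHS solver is installed):

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, x);

julia> optimize!(model)

julia> value(x)
1.0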

value(var_value::Function, ex::GenericQuadExpr)

Evaluate ex using var_value(v) as the value for each variable v.

value(var_value::Function, ex::GenericAffExpr)

Evaluate ex using var_value(v) as the value for each variable v.
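
Example

A small sketch of the functional form, which evaluates the expression without solving the model (the substituted value 3.0 is illustrative):

julia> model = Model();

julia> @variable(model, x);

julia> ex = 2x + 1
2 x + 1

julia> value(v -> 3.0, ex)
7.0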

value_type(::Type{<:Union{AbstractModel,AbstractVariableRef}})

Return the return type of value for variables of that model. It defaults to Float64 if it is not implemented.

Example

julia> value_type(GenericModel{BigFloat})
BigFloat
variable_by_name(
    model::AbstractModel,
    name::String,
)::Union{AbstractVariableRef,Nothing}

Returns the reference of the variable with name attribute name, or nothing if no variable has this name attribute. Throws an error if several variables have name as their name attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> variable_by_name(model, "x")
x

julia> @variable(model, base_name="x")
x

julia> variable_by_name(model, "x")
ERROR: Multiple variables have the name x.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] get(::MOIU.Model{Float64}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/model.jl:222
 [3] get at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/universalfallback.jl:201 [inlined]
 [4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MOIU.Model{Float64}}}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/cachingoptimizer.jl:490
 [5] variable_by_name(::GenericModel, ::String) at /home/blegat/.julia/dev/JuMP/src/variables.jl:268
 [6] top-level scope at none:0

julia> var = @variable(model, base_name="y")
y

julia> variable_by_name(model, "y")
y

julia> set_name(var, "z")

julia> variable_by_name(model, "y")

julia> variable_by_name(model, "z")
z

julia> @variable(model, u[1:2])
2-element Vector{VariableRef}:
 u[1]
 u[2]

julia> variable_by_name(model, "u[2]")
u[2]
variable_ref_type(::Union{F,Type{F}}) where {F}

A helper function used internally by JuMP and some JuMP extensions. Returns the variable type associated with the model or expression type F.
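
Example

A small sketch, assuming the method accepts a model type as in the signature above:

julia> variable_ref_type(GenericModel{BigFloat})
GenericVariableRef{BigFloat}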

vectorize(matrix::AbstractMatrix, ::Shape)

Convert the matrix into a vector according to Shape.
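
Example

A small sketch, assuming the SymmetricMatrixShape shape and the same upper-triangle ordering shown for triangle_vec above:

julia> A = [1 2; 2 3]
2×2 Matrix{Int64}:
 1  2
 2  3

julia> vectorize(A, SymmetricMatrixShape(2))
3-element Vector{Int64}:
 1
 2
 3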

write_to_file(
    model::GenericModel,
    filename::String;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
    kwargs...,
)

Write the JuMP model model to filename in the format format.

If the filename ends in .gz, it will be compressed using GZip. If the filename ends in .bz2, it will be compressed using BZip2.

Other kwargs are passed to the Model constructor of the chosen format.
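
Example

A minimal sketch; the filename my_model.mps is illustrative, and the format is inferred from the extension:

julia> model = Model();

julia> @variable(model, x >= 0);

julia> @objective(model, Min, 2 * x + 1);

julia> write_to_file(model, "my_model.mps")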

MOIU.attach_optimizer(model::GenericModel)

Call MOIU.attach_optimizer on the backend of model.

Cannot be called in direct mode.

MOIU.drop_optimizer(model::GenericModel)

Call MOIU.drop_optimizer on the backend of model.

Cannot be called in direct mode.

MOIU.reset_optimizer(model::GenericModel, optimizer::MOI.AbstractOptimizer)

Call MOIU.reset_optimizer on the backend of model.

Cannot be called in direct mode.

MOIU.reset_optimizer(model::GenericModel)

Call MOIU.reset_optimizer on the backend of model.

Cannot be called in direct mode.

get(model::GenericModel, attr::MathOptInterface.AbstractModelAttribute)

Return the value of the attribute attr from the model’s MOI backend.

get(model::GenericModel, attr::MathOptInterface.AbstractOptimizerAttribute)

Return the value of the attribute attr from the model’s MOI backend.
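
Example

A short usage sketch, calling these methods through MOI.get:

julia> model = Model();

julia> @variable(model, x);

julia> MOI.get(model, MOI.NumberOfVariables())
1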

@NLconstraint(model::GenericModel, expr)

Add a constraint described by the nonlinear expression expr. See also @constraint.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLconstraint with @constraint.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @NLconstraint(model, sin(x) <= 1)
sin(x) - 1.0 ≤ 0

julia> @NLconstraint(model, [i = 1:3], sin(i * x) <= 1 / i)
3-element Vector{NonlinearConstraintRef{ScalarShape}}:
 (sin(1.0 * x) - 1.0 / 1.0) - 0.0 ≤ 0
 (sin(2.0 * x) - 1.0 / 2.0) - 0.0 ≤ 0
 (sin(3.0 * x) - 1.0 / 3.0) - 0.0 ≤ 0
@NLconstraints(model, args...)

Adds multiple nonlinear constraints to model at once, in the same fashion as the @NLconstraint macro.

The model must be the first argument, and multiple constraints can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the constraints that were defined.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLconstraints with @constraints.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, t);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @NLconstraints(model, begin
           t >= sqrt(x^2 + y^2)
           [i = 1:2], z[i] <= log(a[i])
       end)
((t - sqrt(x ^ 2.0 + y ^ 2.0)) - 0.0 ≥ 0, NonlinearConstraintRef{ScalarShape}[(z[1] - log(4.0)) - 0.0 ≤ 0, (z[2] - log(5.0)) - 0.0 ≤ 0])
@NLexpression(args...)

Efficiently build a nonlinear expression which can then be inserted in other nonlinear constraints and the objective. See also @expression.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLexpression with @expression.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, y)
y

julia> @NLexpression(model, my_expr, sin(x)^2 + cos(x^2))
subexpression[1]: sin(x) ^ 2.0 + cos(x ^ 2.0)

julia> @NLconstraint(model, my_expr + y >= 5)
(subexpression[1] + y) - 5.0 ≥ 0

julia> @NLobjective(model, Min, my_expr)

Indexing over sets and anonymous expressions are also supported:

julia> @NLexpression(model, my_expr_1[i=1:3], sin(i * x))
3-element Vector{NonlinearExpression}:
 subexpression[2]: sin(1.0 * x)
 subexpression[3]: sin(2.0 * x)
 subexpression[4]: sin(3.0 * x)

julia> my_expr_2 = @NLexpression(model, log(1 + sum(exp(my_expr_1[i]) for i in 1:2)))
subexpression[5]: log(1.0 + (exp(subexpression[2]) + exp(subexpression[3])))
@NLexpressions(model, args...)

Adds multiple nonlinear expressions to model at once, in the same fashion as the @NLexpression macro.

The model must be the first argument, and multiple expressions can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the expressions that were defined.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLexpressions with @expressions.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @NLexpressions(model, begin
           my_expr, sqrt(x^2 + y^2)
           my_expr_1[i = 1:2], log(a[i]) - z[i]
       end)
(subexpression[1]: sqrt(x ^ 2.0 + y ^ 2.0), NonlinearExpression[subexpression[2]: log(4.0) - z[1], subexpression[3]: log(5.0) - z[2]])
@NLobjective(model, sense, expression)

Add a nonlinear objective to model with optimization sense sense. sense must be Max or Min.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLobjective with @objective.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @NLobjective(model, Max, 2x + 1 + sin(x))

julia> print(model)
Max 2.0 * x + 1.0 + sin(x)
Subject to
@NLparameter(model, param == value)

Create and return a nonlinear parameter param attached to the model model with initial value set to value. Nonlinear parameters may be used only in nonlinear expressions.

Example

julia> model = Model();

julia> @NLparameter(model, x == 10)
x == 10.0

julia> value(x)
10.0
@NLparameter(model, value = param_value)

Create and return an anonymous nonlinear parameter param attached to the model model with initial value set to param_value. Nonlinear parameters may be used only in nonlinear expressions.

Example

julia> model = Model();

julia> x = @NLparameter(model, value = 10)
parameter[1] == 10.0

julia> value(x)
10.0
@NLparameter(model, param_collection[...] == value_expr)

Create and return a collection of nonlinear parameters param_collection attached to the model model with initial value set to value_expr (may depend on index sets). Uses the same syntax for specifying index sets as @variable.

Example

julia> model = Model();

julia> @NLparameter(model, y[i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
 parameter[1] == 2.0
 parameter[2] == 4.0
 parameter[3] == 6.0

julia> value(y[2])
4.0
@NLparameter(model, [...] == value_expr)

Create and return an anonymous collection of nonlinear parameters attached to the model model with initial value set to value_expr (may depend on index sets). Uses the same syntax for specifying index sets as @variable.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace a call like @NLparameter(model, p == value) with @variable(model, p in Parameter(value)).

Example

julia> model = Model();

julia> y = @NLparameter(model, [i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
 parameter[1] == 2.0
 parameter[2] == 4.0
 parameter[3] == 6.0

julia> value(y[2])
4.0
@NLparameters(model, args...)

Create and return multiple nonlinear parameters attached to the model model, in the same fashion as the @NLparameter macro.

The model must be the first argument, and multiple parameters can be added on multiple lines wrapped in a begin ... end block. Distinct parameters need to be placed on separate lines as in the following example.

The macro returns a tuple containing the parameters that were defined.

Compatibility

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace a call like

@NLparameters(model, begin
    p == value
end)

with

@variables(model, begin
    p in Parameter(value)
end)

Example

julia> model = Model();

julia> @NLparameters(model, begin
           x == 10
           b == 156
       end);

julia> value(x)
10.0
@build_constraint(constraint_expr)

Constructs a ScalarConstraint or VectorConstraint using the same machinery as @constraint but without adding the constraint to a model.

Constraints using broadcast operators like x .<= 1 are also supported and will create arrays of ScalarConstraint or VectorConstraint.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @build_constraint(2x >= 1)
ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(2 x, MathOptInterface.GreaterThan{Float64}(1.0))

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @build_constraint(x .>= 0)
2-element Vector{ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}}:
 ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[1], MathOptInterface.GreaterThan{Float64}(-0.0))
 ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[2], MathOptInterface.GreaterThan{Float64}(-0.0))
@constraint(model, expr, args...; kwargs...)
@constraint(model, [index_sets...], expr, args...; kwargs...)
@constraint(model, name, expr, args...; kwargs...)
@constraint(model, name[index_sets...], expr, args...; kwargs...)

Add a constraint described by the expression expr.

The name argument is optional. If index sets are passed, a container is built and the constraint may depend on the indices of the index sets.

The expression expr may be one of following forms:

  • func in set, constraining the function func to belong to the set set, which is either a MOI.AbstractSet or one of the JuMP shortcuts like SecondOrderCone or PSDCone

  • a <op> b, where <op> is one of ==, ≥, >=, ≤, <=

  • l <= f <= u or u >= f >= l, constraining the expression f to lie between l and u

  • f(x) ⟂ x, which defines a complementarity constraint

  • z --> {expr}, which defines an indicator constraint that activates when z is 1

  • !z --> {expr}, which defines an indicator constraint that activates when z is 0

  • z <--> {expr}, which defines a reified constraint

  • expr := rhs, which defines a Boolean equality constraint

Broadcasted comparison operators like .== are also supported for the case when the left- and right-hand sides of the comparison operator are arrays.

JuMP extensions may additionally provide support for constraint expressions which are not listed here.

Keyword arguments

  • base_name: sets the name prefix used to generate constraint names. It corresponds to the constraint name for scalar constraints, otherwise, the constraint names are set to base_name[...] for each index ....

  • container = :Auto: force the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type which is supported by a JuMP extension.

  • set_string_name::Bool = true: control whether to set the MOI.ConstraintName attribute. Passing set_string_name = false can improve performance.

Other keyword arguments may be supported by JuMP extensions.

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @variable(model, z, Bin);

julia> @constraint(model, x in SecondOrderCone())
[x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> @constraint(model, [i in 1:3], x[i] == i)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
 x[1] = 1
 x[2] = 2
 x[3] = 3

julia> @constraint(model, x .== [1, 2, 3])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
 x[1] = 1
 x[2] = 2
 x[3] = 3

julia> @constraint(model, con_name, 1 <= x[1] + x[2] <= 3)
con_name : x[1] + x[2] ∈ [1, 3]

julia> @constraint(model, con_perp[i in 1:3], x[i] - 1 ⟂ x[i])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorAffineFunction{Float64}, MathOptInterface.Complements}, VectorShape}}:
 con_perp[1] : [x[1] - 1, x[1]] ∈ MathOptInterface.Complements(2)
 con_perp[2] : [x[2] - 1, x[2]] ∈ MathOptInterface.Complements(2)
 con_perp[3] : [x[3] - 1, x[3]] ∈ MathOptInterface.Complements(2)

julia> @constraint(model, z --> {x[1] >= 0})
z --> {x[1] ≥ 0}

julia> @constraint(model, !z --> {2 * x[2] <= 3})
!z --> {2 x[2] ≤ 3}
@constraints(model, args...)

Adds groups of constraints at once, in the same fashion as the @constraint macro.

The model must be the first argument, and multiple constraints can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the constraints that were defined.

Example

julia> model = Model();

julia> @variable(model, w);

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:3]);

julia> @constraints(model, begin
           x >= 1
           y - w <= 2
           sum_to_one[i=1:3], z[i] + y == 1
       end);

julia> print(model)
Feasibility
Subject to
 sum_to_one[1] : y + z[1] = 1
 sum_to_one[2] : y + z[2] = 1
 sum_to_one[3] : y + z[3] = 1
 x ≥ 1
 -w + y ≤ 2
@expression(model::GenericModel, expression)
@expression(model::GenericModel, [index_sets...], expression)
@expression(model::GenericModel, name, expression)
@expression(model::GenericModel, name[index_sets...], expression)

Efficiently builds and returns an expression.

The name argument is optional. If index sets are passed, a container is built and the expression may depend on the indices of the index sets.

Keyword arguments

  • container = :Auto: force the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type which is supported by a JuMP extension.

Example

julia> model = Model();

julia> @variable(model, x[1:5]);

julia> @expression(model, shared, sum(i * x[i] for i in 1:5))
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]

julia> shared
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]

In the same way as @variable, the second argument may define index sets, and those indices can be used in the construction of the expressions:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @expression(model, expr[i = 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
 x[1] + x[2] + x[3]
 2 x[1] + 2 x[2] + 2 x[3]
 3 x[1] + 3 x[2] + 3 x[3]

Anonymous syntax is also supported:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> expr = @expression(model, [i in 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
 x[1] + x[2] + x[3]
 2 x[1] + 2 x[2] + 2 x[3]
 3 x[1] + 3 x[2] + 3 x[3]
@expressions(model, args...)

Adds multiple expressions to model at once, in the same fashion as the @expression macro.

The model must be the first argument, and multiple expressions can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the expressions that were defined.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @expressions(model, begin
           my_expr, x^2 + y^2
           my_expr_1[i = 1:2], a[i] - z[i]
       end)
(x² + y², AffExpr[-z[1] + 4, -z[2] + 5])
@force_nonlinear(expr)

Change the parsing of expr to construct GenericNonlinearExpr instead of GenericAffExpr or GenericQuadExpr.

This macro works by walking expr and substituting all calls to +, -, *, /, and ^ in favor of ones that construct GenericNonlinearExpr.

This macro will error if the resulting expression does not produce a GenericNonlinearExpr because, for example, it is used on an expression that does not use the basic arithmetic operators.

When to use this macro

In most cases, you should not use this macro.

Use this macro only if the intended output type is a GenericNonlinearExpr and the regular macro calls destroy problem structure, or, in rare cases, if the regular macro calls introduce a large number of intermediate variables, for example, because they promote types to a common quadratic expression.

Example

Use-case one: preserve problem structure.

julia> model = Model();

julia> @variable(model, x);

julia> @expression(model, (x - 0.1)^2)
x² - 0.2 x + 0.010000000000000002

julia> @expression(model, @force_nonlinear((x - 0.1)^2))
(x - 0.1) ^ 2

julia> (x - 0.1)^2
x² - 0.2 x + 0.010000000000000002

julia> @force_nonlinear((x - 0.1)^2)
(x - 0.1) ^ 2

Use-case two: reduce allocations

In this example, we know that x * 2.0 * (1 + x) * x is going to construct a nonlinear expression.

However, the default parsing first builds the quadratic expression 2 x² + 2 x and only then multiplies it by x, whereas the modified parsing constructs the nonlinear expression x * 2.0 * (1 + x) * x directly.

This results in significantly fewer allocations, as the example below shows.

julia> model = Model();

julia> @variable(model, x);

julia> @expression(model, x * 2.0 * (1 + x) * x)
(2 x² + 2 x) * x

julia> @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
x * 2.0 * (1 + x) * x

julia> @allocated @expression(model, x * 2.0 * (1 + x) * x)
3200

julia> @allocated @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
640
@objective(model::GenericModel, sense, func)

Set the objective sense to sense and objective function to func.

The objective sense can be either Min, Max, MOI.MIN_SENSE, MOI.MAX_SENSE or MOI.FEASIBILITY_SENSE. In order to set the sense programmatically, that is, when sense is a variable whose value is the sense, one of the three MOI.OptimizationSense values must be used.

Example

To minimize the value of the variable x, do:

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Min, x)
x

To maximize the value of the affine expression 2x - 1, do:

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Max, 2x - 1)
2 x - 1

Set the objective sense programmatically:

julia> model = Model();

julia> @variable(model, x)
x

julia> sense = MIN_SENSE
MIN_SENSE::OptimizationSense = 0

julia> @objective(model, sense, x^2 - 2x + 1)
x² - 2 x + 1
@operator(model, operator, dim, f[, ∇f[, ∇²f]])

Add the nonlinear operator operator in model with dim arguments, and create a new NonlinearOperator object called operator in the current scope.

The function f evaluates the operator and must return a scalar.

The optional function ∇f evaluates the first derivative, and the optional function ∇²f evaluates the second derivative.

∇²f may be provided only if ∇f is also provided.

Univariate syntax

If dim == 1, then the method signatures of each function must be:

  • f(::T)::T where {T<:Real}

  • ∇f(::T)::T where {T<:Real}

  • ∇²f(::T)::T where {T<:Real}

Multivariate syntax

If dim > 1, then the method signatures of each function must be:

  • f(x::T...)::T where {T<:Real}

  • ∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}

  • ∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}

The gradient vector g and the Hessian matrix H are filled in-place. For the Hessian, you must fill in the non-zero lower-triangular entries only; setting an off-diagonal upper-triangular element may error.
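
A minimal sketch of the multivariate form; the function and operator names below are illustrative:

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> my_f(x...) = x[1]^2 + x[2]^2
my_f (generic function with 1 method)

julia> function ∇my_f(g::AbstractVector, x...)
           g[1] = 2 * x[1]
           g[2] = 2 * x[2]
           return
       end
∇my_f (generic function with 1 method)

julia> @operator(model, op_my_f, 2, my_f, ∇my_f)
NonlinearOperator(my_f, :op_my_f)

julia> @objective(model, Min, op_my_f(x[1], x[2]))
op_my_f(x[1], x[2])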

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)

julia> @objective(model, Min, op_f(x))
op_f(x)

julia> op_f(2.0)
4.0

julia> model[:op_f]
NonlinearOperator(f, :op_f)

julia> model[:op_f](x)
op_f(x)

Non-macro version

This macro is provided as helpful syntax that matches the style of the rest of the JuMP macros. However, you may also add operators outside the macro using add_nonlinear_operator. For example:

julia> model = Model();

julia> f(x) = x^2
f (generic function with 1 method)

julia> @operator(model, op_f, 1, f)
NonlinearOperator(f, :op_f)

is equivalent to

julia> model = Model();

julia> f(x) = x^2
f (generic function with 1 method)

julia> op_f = model[:op_f] = add_nonlinear_operator(model, 1, f; name = :op_f)
NonlinearOperator(f, :op_f)
@variable(model, expr, args..., kw_args...)

Add a variable to the model model described by the expression expr, the positional arguments args and the keyword arguments kw_args.

Anonymous and named variables

expr must be one of the forms:

  • Omitted, like @variable(model), which creates an anonymous variable

  • A single symbol like @variable(model, x)

  • A container expression like @variable(model, x[i=1:3])

  • An anonymous container expression like @variable(model, [i=1:3])

Bounds

In addition, the expression can have bounds, such as:

  • @variable(model, x >= 0)

  • @variable(model, x <= 0)

  • @variable(model, x == 0)

  • @variable(model, 0 <= x <= 1)

and bounds can depend on the indices of the container expressions:

  • @variable(model, -i <= x[i=1:3] <= i)

Sets

You can explicitly specify the set to which the variable belongs:

  • @variable(model, x in MOI.Interval(0.0, 1.0))

For more information on this syntax, read Variables constrained on creation.

Positional arguments

The recognized positional arguments in args are the following:

  • Bin: restricts the variable to the MOI.ZeroOne set, that is, {0, 1}. For example, @variable(model, x, Bin). Note: you cannot use @variable(model, Bin), use the binary keyword instead.

  • Int: restricts the variable to the set of integers, that is, …​, -2, -1, 0, 1, 2, …​ For example, @variable(model, x, Int). Note: you cannot use @variable(model, Int), use the integer keyword instead.

  • Symmetric: Only available when creating a square matrix of variables, that is when expr is of the form varname[1:n,1:n] or varname[i=1:n,j=1:n], it creates a symmetric matrix of variables.

  • PSD: a restrictive extension to Symmetric which constrains a square matrix of variables to be both symmetric and positive semidefinite.

Keyword arguments

Four keyword arguments are useful in all cases:

  • base_name: sets the name prefix used to generate variable names. It corresponds to the variable name for a scalar variable; otherwise, the variable names are set to base_name[...] for each index ... of the container's axes.

  • start::Float64: specify the value passed to set_start_value for each variable

  • container: specify the container type. See Forcing the container type for more information.

  • set_string_name::Bool = true: control whether to set the MOI.VariableName attribute. Passing set_string_name = false can improve performance.

Other keyword arguments are needed to disambiguate situations with anonymous variables:

  • lower_bound::Float64: an alternative to x >= lb, sets the value of the variable lower bound.

  • upper_bound::Float64: an alternative to x <= ub, sets the value of the variable upper bound.

  • binary::Bool: an alternative to passing Bin, sets whether the variable is binary or not.

  • integer::Bool: an alternative to passing Int, sets whether the variable is integer or not.

  • set::MOI.AbstractSet: an alternative to using x in set

  • variable_type: used by JuMP extensions. See Extend @variable for more information.

Example

The following are equivalent ways of creating a variable x of name x with lower bound 0:

julia> model = Model();

julia> @variable(model, x >= 0)
x

julia> model = Model();

julia> @variable(model, x, lower_bound = 0)
x

julia> model = Model();

julia> x = @variable(model, base_name = "x", lower_bound = 0)
x

Other examples:

julia> model = Model();

julia> @variable(model, x[i=1:3] <= i, Int, start = sqrt(i), lower_bound = -i)
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> @variable(model, y[i=1:3], container = DenseAxisArray, set = MOI.ZeroOne())
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
    Dimension 1, Base.OneTo(3)
And data, a 3-element Vector{VariableRef}:
 y[1]
 y[2]
 y[3]

julia> @variable(model, z[i=1:3], set_string_name = false)
3-element Vector{VariableRef}:
 _[7]
 _[8]
 _[9]
@variables(model, args...)

Adds multiple variables to model at once, in the same fashion as the @variable macro.

The model must be the first argument, and multiple variables can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the variables that were defined.

Example

julia> model = Model();

julia> @variables(model, begin
           x
           y[i = 1:2] >= 0, (start = i)
           z, Bin, (start = 0, base_name = "Z")
       end)
(x, VariableRef[y[1], y[2]], Z)

Keyword arguments must be contained within parentheses (refer to the example above).

TerminationStatusCode

An Enum of possible values for the TerminationStatus attribute. This attribute is meant to explain the reason why the optimizer stopped executing in the most recent call to optimize!.

Values

Possible values are:

  • OPTIMIZE_NOT_CALLED: The algorithm has not started.

  • OPTIMAL: The algorithm found a globally optimal solution.

  • INFEASIBLE: The algorithm concluded that no feasible solution exists.

  • DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem. If, additionally, a feasible (primal) solution is known to exist, this status typically implies that the problem is unbounded, with some technical exceptions.

  • LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees.

  • LOCALLY_INFEASIBLE: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, without guarantees that no feasible solution exists.

  • INFEASIBLE_OR_UNBOUNDED: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve.

  • ALMOST_OPTIMAL: The algorithm found a globally optimal solution to relaxed tolerances.

  • ALMOST_INFEASIBLE: The algorithm concluded that no feasible solution exists within relaxed tolerances.

  • ALMOST_DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem within relaxed tolerances.

  • ALMOST_LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, or could not find directions for improvement within relaxed tolerances.

  • ITERATION_LIMIT: An iterative algorithm stopped after conducting the maximum number of iterations.

  • TIME_LIMIT: The algorithm stopped after a user-specified computation time.

  • NODE_LIMIT: A branch-and-bound algorithm stopped because it explored a maximum number of nodes in the branch-and-bound tree.

  • SOLUTION_LIMIT: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters.

  • MEMORY_LIMIT: The algorithm stopped because it ran out of memory.

  • OBJECTIVE_LIMIT: The algorithm stopped because it found a solution better than a minimum limit set by the user.

  • NORM_LIMIT: The algorithm stopped because the norm of an iterate became too large.

  • OTHER_LIMIT: The algorithm stopped due to a limit not covered by one of the LIMIT statuses above.

  • SLOW_PROGRESS: The algorithm stopped because it was unable to continue making progress towards the solution.

  • NUMERICAL_ERROR: The algorithm stopped because it encountered unrecoverable numerical error.

  • INVALID_MODEL: The algorithm stopped because the model is invalid.

  • INVALID_OPTION: The algorithm stopped because it was provided an invalid option.

  • INTERRUPTED: The algorithm stopped because of an interrupt signal.

  • OTHER_ERROR: The algorithm stopped because of an error not covered by one of the statuses defined above.
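
Example

A typical workflow (a sketch, assuming the HiGHS solver is installed) compares the status returned by termination_status against these values after calling optimize!:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, x);

julia> optimize!(model)

julia> termination_status(model)
OPTIMAL::TerminationStatusCode = 1

julia> termination_status(model) == OPTIMAL
true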

ResultStatusCode

An Enum of possible values for the PrimalStatus and DualStatus attributes.

The values indicate how to interpret the result vector.

Values

Possible values are:

  • NO_SOLUTION: the result vector is empty.

  • FEASIBLE_POINT: the result vector is a feasible point.

  • NEARLY_FEASIBLE_POINT: the result vector is feasible if some constraint tolerances are relaxed.

  • INFEASIBLE_POINT: the result vector is an infeasible point.

  • INFEASIBILITY_CERTIFICATE: the result vector is an infeasibility certificate. If the PrimalStatus is INFEASIBILITY_CERTIFICATE, then the primal result vector is a certificate of dual infeasibility. If the DualStatus is INFEASIBILITY_CERTIFICATE, then the dual result vector is a proof of primal infeasibility.

  • NEARLY_INFEASIBILITY_CERTIFICATE: the result satisfies a relaxed criterion for a certificate of infeasibility.

  • REDUCTION_CERTIFICATE: the result vector is an ill-posed certificate; see this article for details. If the PrimalStatus is REDUCTION_CERTIFICATE, then the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus is REDUCTION_CERTIFICATE, then the dual result vector is a proof that the primal is ill-posed.

  • NEARLY_REDUCTION_CERTIFICATE: the result satisfies a relaxed criterion for an ill-posed certificate.

  • UNKNOWN_RESULT_STATUS: the result vector contains a solution with an unknown interpretation.

  • OTHER_RESULT_STATUS: the result vector contains a solution with an interpretation not covered by one of the statuses defined above
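
Example

A sketch of querying the primal and dual statuses after a solve (assuming the HiGHS solver is installed):

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, x);

julia> optimize!(model)

julia> primal_status(model)
FEASIBLE_POINT::ResultStatusCode = 1

julia> dual_status(model)
FEASIBLE_POINT::ResultStatusCode = 1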

OptimizationSense

An enum for the value of the ObjectiveSense attribute.

Values

Possible values are:

  • MIN_SENSE: the goal is to minimize the objective function

  • MAX_SENSE: the goal is to maximize the objective function

  • FEASIBILITY_SENSE: the model does not have an objective function
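
Example

As an illustration, the current sense of a model can be queried with objective_sense:

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2

julia> @variable(model, x);

julia> @objective(model, Min, x)
x

julia> objective_sense(model)
MIN_SENSE::OptimizationSense = 0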

GenericVariableRef{T} <: AbstractVariableRef

Holds a reference to the model and the corresponding MOI.VariableIndex.