API Help
#
JuMP.JuMP
— Module
JuMP
The algebraic modeling language for Julia.
For more information, see the website https://jump.dev .
#
JuMP.ALMOST_DUAL_INFEASIBLE
— Constant
ALMOST_DUAL_INFEASIBLE::TerminationStatusCode
An instance of the TerminationStatusCode enum.
ALMOST_DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem within relaxed tolerances.
#
JuMP.ALMOST_INFEASIBLE
— Constant
ALMOST_INFEASIBLE::TerminationStatusCode
An instance of the TerminationStatusCode enum.
ALMOST_INFEASIBLE: The algorithm concluded that no feasible solution exists within relaxed tolerances.
#
JuMP.ALMOST_LOCALLY_SOLVED
— Constant
ALMOST_LOCALLY_SOLVED::TerminationStatusCode
An instance of the TerminationStatusCode enum.
ALMOST_LOCALLY_SOLVED: The algorithm converged to a stationary point, a local optimal solution, or could not find directions for improvement within relaxed tolerances.
#
JuMP.ALMOST_OPTIMAL
— Constant
ALMOST_OPTIMAL::TerminationStatusCode
An instance of the TerminationStatusCode enum.
ALMOST_OPTIMAL: The algorithm found a globally optimal solution within relaxed tolerances.
#
JuMP.AUTOMATIC
— Constant
The moi_backend field contains a CachingOptimizer in AUTOMATIC mode.
#
JuMP.DIRECT
— Constant
The moi_backend field contains an AbstractOptimizer. No additional copy of the model is stored. The moi_backend must support add_constraint, etc.
#
JuMP.DUAL_INFEASIBLE
— Constant
DUAL_INFEASIBLE::TerminationStatusCode
An instance of the TerminationStatusCode enum.
DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem. If, in addition, a feasible (primal) solution is known to exist, this status usually means that the problem is unbounded, with some technical exceptions.
#
JuMP.FEASIBILITY_SENSE
— Constant
FEASIBILITY_SENSE::OptimizationSense
An instance of the OptimizationSense enum.
FEASIBILITY_SENSE: The model does not have an objective function.
#
JuMP.FEASIBLE_POINT
— Constant
FEASIBLE_POINT::ResultStatusCode
An instance of the ResultStatusCode enum.
FEASIBLE_POINT: The result vector is a feasible point.
#
JuMP.INFEASIBILITY_CERTIFICATE
— Constant
INFEASIBILITY_CERTIFICATE::ResultStatusCode
An instance of the ResultStatusCode enum.
INFEASIBILITY_CERTIFICATE: The result vector is an infeasibility certificate. If the PrimalStatus is INFEASIBILITY_CERTIFICATE, the primal result vector is a certificate of dual infeasibility. If the DualStatus is INFEASIBILITY_CERTIFICATE, the dual result vector is a certificate of primal infeasibility.
#
JuMP.INFEASIBLE
— Constant
INFEASIBLE::TerminationStatusCode
An instance of the TerminationStatusCode enum.
INFEASIBLE: The algorithm concluded that no feasible solution exists.
#
JuMP.INFEASIBLE_OR_UNBOUNDED
— Constant
INFEASIBLE_OR_UNBOUNDED::TerminationStatusCode
An instance of the TerminationStatusCode enum.
INFEASIBLE_OR_UNBOUNDED: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve.
#
JuMP.INFEASIBLE_POINT
— Constant
INFEASIBLE_POINT::ResultStatusCode
An instance of the ResultStatusCode enum.
INFEASIBLE_POINT: The result vector is an infeasible point.
#
JuMP.INTERRUPTED
— Constant
INTERRUPTED::TerminationStatusCode
An instance of the TerminationStatusCode enum.
INTERRUPTED: The algorithm stopped because of an interrupt signal.
#
JuMP.INVALID_MODEL
— Constant
INVALID_MODEL::TerminationStatusCode
An instance of the TerminationStatusCode enum.
INVALID_MODEL: The algorithm stopped because the model is invalid.
#
JuMP.INVALID_OPTION
— Constant
INVALID_OPTION::TerminationStatusCode
An instance of the TerminationStatusCode enum.
INVALID_OPTION: The algorithm stopped because it was provided an invalid option.
#
JuMP.ITERATION_LIMIT
— Constant
ITERATION_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
ITERATION_LIMIT: An iterative algorithm stopped after conducting the maximum number of iterations.
#
JuMP.LOCALLY_INFEASIBLE
— Constant
LOCALLY_INFEASIBLE::TerminationStatusCode
An instance of the TerminationStatusCode enum.
LOCALLY_INFEASIBLE: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, but without a guarantee that no feasible solution exists.
#
JuMP.LOCALLY_SOLVED
— Constant
LOCALLY_SOLVED::TerminationStatusCode
An instance of the TerminationStatusCode enum.
LOCALLY_SOLVED: The algorithm converged to a stationary point, a local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees.
#
JuMP.MANUAL
— Constant
The moi_backend field contains a CachingOptimizer in MANUAL mode.
#
JuMP.MAX_SENSE
— Constant
MAX_SENSE::OptimizationSense
An instance of the OptimizationSense enum.
MAX_SENSE: The goal is to maximize the objective function.
#
JuMP.MEMORY_LIMIT
— Constant
MEMORY_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
MEMORY_LIMIT: The algorithm stopped because it ran out of memory.
#
JuMP.MIN_SENSE
— Constant
MIN_SENSE::OptimizationSense
An instance of the OptimizationSense enum.
MIN_SENSE: The goal is to minimize the objective function.
#
JuMP.NEARLY_FEASIBLE_POINT
— Constant
NEARLY_FEASIBLE_POINT::ResultStatusCode
An instance of the ResultStatusCode enum.
NEARLY_FEASIBLE_POINT: The result vector is feasible if some constraint tolerances are relaxed.
#
JuMP.NEARLY_INFEASIBILITY_CERTIFICATE
— Constant
NEARLY_INFEASIBILITY_CERTIFICATE::ResultStatusCode
An instance of the ResultStatusCode enum.
NEARLY_INFEASIBILITY_CERTIFICATE: The result satisfies a relaxed criterion for an infeasibility certificate.
#
JuMP.NEARLY_REDUCTION_CERTIFICATE
— Constant
NEARLY_REDUCTION_CERTIFICATE::ResultStatusCode
An instance of the ResultStatusCode enum.
NEARLY_REDUCTION_CERTIFICATE: The result satisfies a relaxed criterion for an ill-posedness certificate.
#
JuMP.NODE_LIMIT
— Constant
NODE_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
NODE_LIMIT: A branch-and-bound algorithm stopped because it explored the maximum number of nodes in the branch-and-bound tree.
#
JuMP.NORM_LIMIT
— Constant
NORM_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
NORM_LIMIT: The algorithm stopped because the norm of an iterate became too large.
#
JuMP.NO_SOLUTION
— Constant
NO_SOLUTION::ResultStatusCode
An instance of the ResultStatusCode enum.
NO_SOLUTION: The result vector is empty.
#
JuMP.NUMERICAL_ERROR
— Constant
NUMERICAL_ERROR::TerminationStatusCode
An instance of the TerminationStatusCode enum.
NUMERICAL_ERROR: The algorithm stopped because it encountered an unrecoverable numerical error.
#
JuMP.OBJECTIVE_LIMIT
— Constant
OBJECTIVE_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
OBJECTIVE_LIMIT: The algorithm stopped because it found a solution better than a minimum limit set by the user.
#
JuMP.OPTIMAL
— Constant
OPTIMAL::TerminationStatusCode
An instance of the TerminationStatusCode enum.
OPTIMAL: The algorithm found a globally optimal solution.
#
JuMP.OPTIMIZE_NOT_CALLED
— Constant
OPTIMIZE_NOT_CALLED::TerminationStatusCode
An instance of the TerminationStatusCode enum.
OPTIMIZE_NOT_CALLED: The algorithm has not started.
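A minimal sketch (not from the original docstring) of where this status appears: querying termination_status on a model that has not been optimized. The exact printed form may differ between versions.
julia> model = Model();
julia> termination_status(model)
OPTIMIZE_NOT_CALLED::TerminationStatusCode = 0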
#
JuMP.OTHER_ERROR
— Constant
OTHER_ERROR::TerminationStatusCode
An instance of the TerminationStatusCode enum.
OTHER_ERROR: The algorithm stopped because of an error not covered by one of the statuses defined above.
#
JuMP.OTHER_LIMIT
— Constant
OTHER_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
OTHER_LIMIT: The algorithm stopped because of a limit not covered by one of the LIMIT statuses above.
#
JuMP.OTHER_RESULT_STATUS
— Constant
OTHER_RESULT_STATUS::ResultStatusCode
An instance of the ResultStatusCode enum.
OTHER_RESULT_STATUS: The result vector contains a solution with an interpretation not covered by one of the statuses defined above.
#
JuMP.REDUCTION_CERTIFICATE
— Constant
REDUCTION_CERTIFICATE::ResultStatusCode
An instance of the ResultStatusCode enum.
REDUCTION_CERTIFICATE: The result vector is an ill-posedness certificate; see https://arxiv.org/abs/1408.4685 for details. If the PrimalStatus is REDUCTION_CERTIFICATE, the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus is REDUCTION_CERTIFICATE, the dual result vector is a proof that the primal problem is ill-posed.
#
JuMP.SLOW_PROGRESS
— Constant
SLOW_PROGRESS::TerminationStatusCode
An instance of the TerminationStatusCode enum.
SLOW_PROGRESS: The algorithm stopped because it was unable to continue making progress towards the solution.
#
JuMP.SOLUTION_LIMIT
— Constant
SOLUTION_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
SOLUTION_LIMIT: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters.
#
JuMP.TIME_LIMIT
— Constant
TIME_LIMIT::TerminationStatusCode
An instance of the TerminationStatusCode enum.
TIME_LIMIT: The algorithm stopped after the user-specified computation time.
#
JuMP.UNKNOWN_RESULT_STATUS
— Constant
UNKNOWN_RESULT_STATUS::ResultStatusCode
An instance of the ResultStatusCode enum.
UNKNOWN_RESULT_STATUS: The result vector contains a solution with an unknown interpretation.
#
JuMP._CONSTRAINT_LIMIT_FOR_PRINTING
— _Constant
const _CONSTRAINT_LIMIT_FOR_PRINTING = Ref{Int}(100)
A global constant used to control when constraints are skipped during model output.
The value is returned and set using `_CONSTRAINT_LIMIT_FOR_PRINTING[]'.
julia> _CONSTRAINT_LIMIT_FOR_PRINTING[]
100
julia> _CONSTRAINT_LIMIT_FOR_PRINTING[] = 10
10
#
JuMP._TERM_LIMIT_FOR_PRINTING
— _Constant
const _TERM_LIMIT_FOR_PRINTING = Ref{Int}(60)
A global constant used to control when members are skipped when outputting expressions.
The value is returned and set using `_TERM_LIMIT_FOR_PRINTING[]'.
julia> _TERM_LIMIT_FOR_PRINTING[]
60
julia> _TERM_LIMIT_FOR_PRINTING[] = 10
10
#
JuMP.op_and
— Constant
op_and(x, y)
A function that falls back to x & y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_and(true, false)
false
julia> op_and(true, x)
true && x
#
JuMP.op_equal_to
— Constant
op_equal_to(x, y)
A function that falls back to x == y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_equal_to(2, 2)
true
julia> op_equal_to(x, 2)
x == 2
#
JuMP.op_greater_than_or_equal_to
— Constant
op_greater_than_or_equal_to(x, y)
A function that falls back to x >= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_greater_than_or_equal_to(2, 2)
true
julia> op_greater_than_or_equal_to(x, 2)
x >= 2
#
JuMP.op_less_than_or_equal_to
— Constant
op_less_than_or_equal_to(x, y)
A function that falls back to x <= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_less_than_or_equal_to(2, 2)
true
julia> op_less_than_or_equal_to(x, 2)
x <= 2
#
JuMP.op_or
— Constant
op_or(x, y)
A function that falls back to x | y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_or(true, false)
true
julia> op_or(true, x)
true || x
#
JuMP.op_strictly_greater_than
— Constant
op_strictly_greater_than(x, y)
A function that falls back to x > y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_strictly_greater_than(1, 2)
false
julia> op_strictly_greater_than(x, 2)
x > 2
#
JuMP.op_strictly_less_than
— Constant
op_strictly_less_than(x, y)
A function that falls back to x < y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_strictly_less_than(1, 2)
true
julia> op_strictly_less_than(x, 2)
x < 2
#
JuMP.AbstractConstraint
— Type
abstract type AbstractConstraint
An abstract base type for all constraint types. AbstractConstraint objects store the function and set directly, unlike ConstraintRef objects, which are merely references to constraints stored in a model. AbstractConstraint objects do not need to be attached to a model.
#
JuMP.AbstractJuMPScalar
— Type
AbstractJuMPScalar <: MutableArithmetics.AbstractMutable
An abstract base type for all scalar types.
Subtyping AbstractMutable allows calls to some Base functions to be redirected to a method in MA that handles type promotion more carefully (for example, the promotion in sparse matrix products in SparseArrays usually does not work for JuMP types) and exploits the mutability of AffExpr and QuadExpr.
#
JuMP.AbstractModel
— Type
AbstractModel
An abstract type whose subtypes should be created when creating JuMP extensions.
#
JuMP.AbstractShape
— Type
AbstractShape
An abstract vectorizable shape. Given a flat vector form of an object of shape shape, the original object can be obtained using reshape_vector.
#
JuMP.AbstractVariable
— Type
AbstractVariable
A variable returned by build_variable. It represents a variable that has not yet been added to any model. It can be added to a given model model using add_variable.
#
JuMP.AbstractVariableRef
— Type
AbstractVariableRef
A variable returned by add_variable. Affine (respectively, quadratic) operations with variables of type V<:AbstractVariableRef and coefficients of type T create a GenericAffExpr{T,V} (respectively, a GenericQuadExpr{T,V}).
#
JuMP.AffExpr
— Type
AffExpr
An alias for GenericAffExpr{Float64,VariableRef}, the specific variant of GenericAffExpr used by JuMP.
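As a quick illustration of the alias (a sketch, not part of the original docstring):
julia> model = Model();
julia> @variable(model, x);
julia> typeof(2x + 1) == AffExpr
true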
#
JuMP.ArrayShape
— Type
ArrayShape{N}(dims::NTuple{N,Int}) where {N}
An AbstractShape
that represents array-valued constraints.
Example
julia> model = Model();
julia> @variable(model, x[1:2, 1:3]);
julia> c = @constraint(model, x >= 0, Nonnegatives())
[x[1,1] x[1,2] x[1,3]
x[2,1] x[2,2] x[2,3]] ∈ Nonnegatives()
julia> shape(constraint_object(c))
ArrayShape{2}((2, 3))
#
JuMP.BridgeableConstraint
— Type
BridgeableConstraint(
constraint::C,
bridge_type::B;
coefficient_type::Type{T} = Float64,
) where {C<:AbstractConstraint,B<:Type{<:MOI.Bridges.AbstractBridge},T}
An AbstractConstraint representing a constraint that can be bridged by a bridge of type bridge_type{coefficient_type}.
Adding a BridgeableConstraint
to the model is equivalent to the following:
add_bridge(model, bridge_type; coefficient_type = coefficient_type)
add_constraint(model, constraint)
Example
Given a new scalar set type CustomSet with a bridge CustomBridge that can bridge F-in-CustomSet constraints, when the user executes the following code:
model = Model()
@variable(model, x)
@constraint(model, x + 1 in CustomSet())
optimize!(model)
with an optimizer that does not support F-in-CustomSet constraints, the constraint will not be bridged unless add_bridge(model, CustomBridge) is called first.
To automatically add the CustomBridge to any model to which an F-in-CustomSet constraint is added, define the following method:
function JuMP.build_constraint(
error_fn::Function,
func::AbstractJuMPScalar,
set::CustomSet,
)
constraint = ScalarConstraint(func, set)
return BridgeableConstraint(constraint, CustomBridge)
end
Note
JuMP extensions should extend JuMP.build_constraint
only if they also define a CustomSet
, for three reasons:
-
It is problematic if multiple extensions overload the same JuMP method.
-
A missing method will not inform users that they forgot to load the extension module that defines the build_constraint method.
-
Defining a method where neither the function nor any of the argument types are defined in the package is called type piracy ( https://docs.julialang.org/en/v1/manual/style-guide/index.html#Avoid-type-piracy-1 ) and is discouraged in the Julia style guide.
#
JuMP.ComplexPlane
— Type
ComplexPlane
A complex plane object that can be used to create a complex variable in the @variable macro.
Example
Consider the following example.
julia> model = Model();
julia> @variable(model, x in ComplexPlane())
real(x) + imag(x) im
julia> all_variables(model)
2-element Vector{VariableRef}:
real(x)
imag(x)
In the output of the last command, we see that two real variables have been created. The Julia variable x
is associated with an affine expression by means of these two variables, which parametrize the complex plane.
#
JuMP.ComplexVariable
— Type
ComplexVariable{S,T,U,V} <: AbstractVariable
The structure used when adding complex variables.
See also the type description ComplexPlane
.
#
JuMP.ConstraintNotOwned
— Type
struct ConstraintNotOwned{C<:ConstraintRef} <: Exception
constraint_ref::C
end
An error that occurs when using the constraint constraint_ref
in a model other than `owner_model(constraint_ref)'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, x >= 0)
c : x ≥ 0
julia> model_new = Model();
julia> MOI.get(model_new, MOI.ConstraintName(), c)
ERROR: ConstraintNotOwned{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}(c : x ≥ 0)
Stacktrace:
[...]
#
JuMP.ConstraintRef
— Type
ConstraintRef
Holds a reference to the model and the corresponding MOI.ConstraintIndex.
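A minimal sketch (assumed, not from the original docstring) showing the two pieces a ConstraintRef ties together, via owner_model and index; the exact printed form of the index may vary.
julia> model = Model();
julia> @variable(model, x);
julia> c = @constraint(model, 2x <= 1);
julia> owner_model(c) === model
true
julia> index(c)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}(1)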
#
JuMP.GenericAffExpr
— Type
mutable struct GenericAffExpr{CoefType,VarType} <: AbstractJuMPScalar
constant::CoefType
terms::OrderedDict{VarType,CoefType}
end
An expression type representing an affine expression of the form: ∑ aᵢ xᵢ + c.
Fields
-
.constant
: the constant c in the expression.
-
.terms
: an OrderedDict with keys of type VarType and values of type CoefType describing the sparse vector a.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = x[2] + 3.0 * x[1] + 4.0
x[2] + 3 x[1] + 4
julia> expr.constant
4.0
julia> expr.terms
OrderedCollections.OrderedDict{VariableRef, Float64} with 2 entries:
x[2] => 1.0
x[1] => 3.0
#
JuMP.GenericAffExpr
— Method
GenericAffExpr(constant::V, kv::Vararg{Pair{K,V},N}) where {K,V,N}
Creates a GenericAffExpr by passing a constant and pairs of additional arguments.
Example
julia> model = Model();
julia> @variable(model, x);
julia> GenericAffExpr(1.0, x => 1.0)
x + 1
#
JuMP.GenericAffExpr
— Method
GenericAffExpr(constant::V, kv::AbstractArray{Pair{K,V}}) where {K,V}
Creates GenericAffExpr
, passing a constant and a vector of pairs.
Example
julia> model = Model();
julia> @variable(model, x);
julia> GenericAffExpr(1.0, [x => 1.0])
x + 1
#
JuMP.GenericModel
— Method
GenericModel{T}(
[optimizer_factory;]
add_bridges::Bool = true,
) where {T<:Real}
Creates an instance of the JuMP model.
If the optimizer_factory argument is specified, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).
If the optimizer_factory argument is not specified, use set_optimizer to set the optimizer before calling optimize!.
If add_bridges, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.
Value type T
Passing a type other than Float64 as the value type T is an advanced operation. The value type must match that expected by the chosen optimizer. For more information, see the documentation on optimizers.
Unless otherwise documented, it should be assumed that an optimizer supports only Float64.
When an unsupported value type is selected, the error MOI.UnsupportedConstraint
or MOI.UnsupportedAttribute
occurs, and the moment it occurs (during model construction or during invocation optimize!
) depends on how the solver interacts with JuMP.
Example
julia> model = GenericModel{BigFloat}();
julia> typeof(model)
GenericModel{BigFloat}
#
JuMP.GenericNonlinearExpr
— Type
GenericNonlinearExpr{V}(head::Symbol, args::Vector{Any})
GenericNonlinearExpr{V}(head::Symbol, args::Any...)
Nonlinear function head(args...)
with a scalar value, represented as a symbolic expression tree, with the call operator head
and ordered arguments in args
.
V is the subtype of AbstractVariableRef present in the expression, and is used for dispatch in JuMP extensions.
head
The operator head::Symbol must be supported by the model.
The default list of supported univariate operators is given by:
-
MOI.Nonlinear.DEFAULT_UNIVARIATE_OPERATORS
The default list of supported multivariate operators is given by:
-
MOI.Nonlinear.DEFAULT_MULTIVARIATE_OPERATORS
Additional operators can be added using the @operator macro.
To see the full list of operators supported by an MOI.ModelLike, query the MOI.ListOfSupportedNonlinearOperators attribute.
args
The vector args contains the arguments of the nonlinear function. If the operator is univariate, it must contain one element. Otherwise, it may contain multiple elements.
Given a subtype V of AbstractVariableRef for GenericNonlinearExpr{V}, each element must be one of the following:
-
a constant value of type <:Real;
-
a V;
-
a GenericAffExpr{T,V};
-
a GenericQuadExpr{T,V};
-
a GenericNonlinearExpr{V};
where T<:Real and T == value_type(V).
Unsupported operators
If the optimizer does not support head, an MOI.UnsupportedNonlinearOperator error is thrown.
This error may be thrown at different times: when the function is first added to the model, or when optimize! is called.
Example
To represent the function f(x) = sin(x)², execute the following code:
julia> model = Model();
julia> @variable(model, x)
x
julia> f = sin(x)^2
sin(x) ^ 2.0
julia> f = GenericNonlinearExpr{VariableRef}(
:^,
GenericNonlinearExpr{VariableRef}(:sin, x),
2.0,
)
sin(x) ^ 2.0
#
JuMP.GenericQuadExpr
— Type
mutable struct GenericQuadExpr{CoefType,VarType} <: AbstractJuMPScalar
aff::GenericAffExpr{CoefType,VarType}
terms::OrderedDict{UnorderedPair{VarType}, CoefType}
end
An expression type representing a quadratic expression of the form: ∑ qᵢⱼ xᵢ xⱼ + ∑ aᵢ xᵢ + c.
Fields
-
.aff
: a GenericAffExpr representing the affine part of the expression.
-
.terms
: an OrderedDict with keys of type UnorderedPair{VarType} and values of type CoefType describing the sparse list of terms q.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = 2.0 * x[1]^2 + x[1] * x[2] + 3.0 * x[1] + 4.0
2 x[1]² + x[1]*x[2] + 3 x[1] + 4
julia> expr.aff
3 x[1] + 4
julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 2 entries:
UnorderedPair{VariableRef}(x[1], x[1]) => 2.0
UnorderedPair{VariableRef}(x[1], x[2]) => 1.0
#
JuMP.GenericQuadExpr
— Method
GenericQuadExpr(
aff::GenericAffExpr{V,K},
kv::AbstractArray{Pair{UnorderedPair{K},V}}
) where {K,V}
Creates GenericQuadExpr
, passing GenericAffExpr
and a vector of pairs (`UnorderedPair', coefficient).
Example
julia> model = Model();
julia> @variable(model, x);
julia> GenericQuadExpr(GenericAffExpr(1.0, x => 2.0), [UnorderedPair(x, x) => 3.0])
3 x² + 2 x + 1
#
JuMP.GenericReferenceMap
— Type
GenericReferenceMap{T}
A mapping between the variable and constraint references of a model and those of its copy. The reference in the copied model can be obtained by indexing the map with the corresponding reference in the original model.
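A short sketch (assumed, not from the original docstring) using copy_model, which returns such a map:
julia> model = Model();
julia> @variable(model, x);
julia> new_model, reference_map = copy_model(model);
julia> x_new = reference_map[x]
x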
#
JuMP.GenericVariableRef
— Type
GenericVariableRef{T} <: AbstractVariableRef
Contains a reference to the model and the corresponding MOI.VariableIndex.
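A minimal sketch (assumed, not from the original docstring) showing the model and MOI index behind a variable; the exact printed form of the index may vary.
julia> model = Model();
julia> @variable(model, x);
julia> owner_model(x) === model
true
julia> index(x)
MOI.VariableIndex(1)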
#
JuMP.GenericVariableRef
— Method
GenericVariableRef{T}(c::ConstraintRef)
Returns the variable associated with ConstraintRef
if c
is a constraint for one variable.
Example
julia> model = Model();
julia> @variable(model, x >= 0)
x
julia> c = LowerBoundRef(x)
x ≥ 0
julia> VariableRef(c) == x
true
#
JuMP.GreaterThanZero
— Type
GreaterThanZero()
A struct used to intercept when >=
or ≥
is used in a macro via operator_to_set
.
This struct is not the same as Nonnegatives
so that we can disambiguate x >= y
and x - y in Nonnegatives()
.
This struct is not intended for general usage, but it may be useful to some JuMP extensions.
Example
julia> operator_to_set(error, Val(:>=))
GreaterThanZero()
#
JuMP.HermitianMatrixAdjointShape
— Type
HermitianMatrixAdjointShape(side_dimension)
The dual_shape
of HermitianMatrixShape
.
This shape is not intended for regular use.
#
JuMP.HermitianMatrixShape
— Type
HermitianMatrixShape
A shape object for a Hermitian square matrix of `side_dimension' rows and columns. The vectorized form corresponds to `MOI.HermitianPositiveSemidefiniteConeTriangle'.
#
JuMP.HermitianMatrixSpace
— Type
HermitianMatrixSpace()
Use in the @variable macro to constrain a matrix of variables to be Hermitian.
Example
julia> model = Model();
julia> @variable(model, Q[1:2, 1:2] in HermitianMatrixSpace())
2×2 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
real(Q[1,1]) real(Q[1,2]) + imag(Q[1,2]) im
real(Q[1,2]) - imag(Q[1,2]) im real(Q[2,2])
#
JuMP.HermitianPSDCone
— Type
HermitianPSDCone
A Hermitian positive semidefinite cone object that can be used to create a Hermitian positive semidefinite square matrix in the @variable and @constraint macros.
Example
Consider the following example.
julia> model = Model();
julia> @variable(model, H[1:3, 1:3] in HermitianPSDCone())
3×3 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
real(H[1,1]) … real(H[1,3]) + imag(H[1,3]) im
real(H[1,2]) - imag(H[1,2]) im real(H[2,3]) + imag(H[2,3]) im
real(H[1,3]) - imag(H[1,3]) im real(H[3,3])
julia> all_variables(model)
9-element Vector{VariableRef}:
real(H[1,1])
real(H[1,2])
real(H[2,2])
real(H[1,3])
real(H[2,3])
real(H[3,3])
imag(H[1,2])
imag(H[1,3])
imag(H[2,3])
julia> all_constraints(model, Vector{VariableRef}, MOI.HermitianPositiveSemidefiniteConeTriangle)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorOfVariables, MathOptInterface.HermitianPositiveSemidefiniteConeTriangle}}}:
[real(H[1,1]), real(H[1,2]), real(H[2,2]), real(H[1,3]), real(H[2,3]), real(H[3,3]), imag(H[1,2]), imag(H[1,3]), imag(H[2,3])] ∈ MathOptInterface.HermitianPositiveSemidefiniteConeTriangle(3)
In the output of the last commands, we see that nine real variables have been created. The matrix H restricts affine expressions to these nine variables, which parametrize the Hermitian matrix.
#
JuMP.LPMatrixData
— Type
LPMatrixData{T}
The structure returned by the method lp_matrix_data
. For a description of the public fields, see the section on lp_matrix_data
.
#
JuMP.LessThanZero
— Type
LessThanZero()
A struct used to intercept when <=
or ≤
is used in a macro via operator_to_set
.
This struct is not the same as Nonpositives
so that we can disambiguate x <= y
and x - y in Nonpositives()
.
This struct is not intended for general usage, but it may be useful to some JuMP extensions.
Example
julia> operator_to_set(error, Val(:<=))
LessThanZero()
#
JuMP.LinearTermIterator
— Type
LinearTermIterator{GAE<:GenericAffExpr}
A struct implementing the iterate protocol for iterating over the (coefficient, variable) tuples of a GenericAffExpr.
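A short sketch (assumed, not from the original docstring): the iterator is returned by linear_terms.
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = 2.0 * x[1] + 3.0 * x[2] + 1.0;
julia> collect(linear_terms(expr))
2-element Vector{Tuple{Float64, VariableRef}}:
 (2.0, x[1])
 (3.0, x[2])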
#
JuMP.Model
— Type
Model([optimizer_factory;] add_bridges::Bool = true)
Creates an instance of the JuMP model.
If the optimizer_factory argument is specified, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).
If the optimizer_factory argument is not specified, use set_optimizer to set the optimizer before calling optimize!.
If add_bridges, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> solver_name(model)
"Ipopt"
julia> import HiGHS
julia> import MultiObjectiveAlgorithms as MOA
julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer); add_bridges = false);
#
JuMP.ModelMode
— Type
ModelMode
An enumeration for describing the state of the CachingOptimizer inside the JuMP model.
See also the description of the mode method.
Values
Possible values:
-
AUTOMATIC
-
MANUAL
-
DIRECT
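A minimal sketch (assumed, not from the original docstring) querying the mode of a default model; the exact printed form may vary.
julia> model = Model();
julia> mode(model)
AUTOMATIC::ModelMode = 0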
#
JuMP.NoOptimizer
— Type
struct NoOptimizer <: Exception end
An error thrown when no optimizer is set and one is required.
The optimizer can be provided to the Model constructor or by calling set_optimizer.
Example
julia> model = Model();
julia> optimize!(model)
ERROR: NoOptimizer()
Stacktrace:
[...]
#
JuMP.NonlinearConstraintRef
— Type
NonlinearConstraintRef
Compatibility
This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface, which is described in the Nonlinear Modeling section.
#
JuMP.NonlinearExpr
— Type
NonlinearExpr
An alias for GenericNonlinearExpr{VariableRef}, the specific variant of GenericNonlinearExpr used by JuMP.
#
JuMP.NonlinearExpression
— Type
NonlinearExpression <: AbstractJuMPScalar
A structure representing a nonlinear expression.
The expression is created using a macro @NLexpression
.
Compatibility
This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface, which is described in the Nonlinear Modeling section.
#
JuMP.NonlinearOperator
— Type
NonlinearOperator(func::Function, head::Symbol)
A callable struct (a functor) representing a function named head.
When called with AbstractJuMPScalar arguments, it returns a GenericNonlinearExpr.
When called with non-JuMP types, it returns the result of evaluating func(args...).
Unless head is special-cased by the optimizer, the operator must already have been added to the model using add_nonlinear_operator or @operator.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::Float64) = x^2
f (generic function with 1 method)
julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)
julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)
julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)
julia> bar = NonlinearOperator(f, :op_f)
NonlinearOperator(f, :op_f)
julia> @objective(model, Min, bar(x))
op_f(x)
julia> bar(2.0)
4.0
#
JuMP.NonlinearParameter
— Type
NonlinearParameter <: AbstractJuMPScalar
A structure representing a nonlinear parameter.
The parameter is created using a macro @NLparameter
.
Compatibility
This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface, which is described in the Nonlinear Modeling section.
#
JuMP.Nonnegatives
— Type
Nonnegatives()
The JuMP equivalent of the MOI.Nonnegatives set, in which the dimension is inferred from the corresponding function.
Example
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> @constraint(model, x in Nonnegatives())
[x[1], x[2]] ∈ MathOptInterface.Nonnegatives(2)
julia> A = [1 2; 3 4];
julia> b = [5, 6];
julia> @constraint(model, A * x >= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ MathOptInterface.Nonnegatives(2)
#
JuMP.Nonpositives
— Type
Nonpositives()
The JuMP equivalent of the MOI.Nonpositives set, in which the dimension is inferred from the corresponding function.
Example
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> @constraint(model, x in Nonpositives())
[x[1], x[2]] ∈ MathOptInterface.Nonpositives(2)
julia> A = [1 2; 3 4];
julia> b = [5, 6];
julia> @constraint(model, A * x <= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ MathOptInterface.Nonpositives(2)
#
JuMP.OptimizeNotCalled
— Type
struct OptimizeNotCalled <: Exception end
An error thrown when a result attribute is queried before optimize! has been called.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> objective_value(model)
ERROR: OptimizeNotCalled()
Stacktrace:
[...]
#
JuMP.PSDCone
— Type
PSDCone
A positive semidefinite cone object that can be used to constrain a square matrix to be positive semidefinite in the @constraint macro.
If the matrix is of type Symmetric, the column vectorization (the vector obtained by concatenating the columns) of its upper triangular part must belong to the MOI.PositiveSemidefiniteConeTriangle set; otherwise, its column vectorization must belong to the MOI.PositiveSemidefiniteConeSquare set.
Example
The non-symmetric case:
julia> model = Model();
julia> @variable(model, x);
julia> a = [x 2x; 2x x];
julia> b = [1 2; 2 4];
julia> cref = @constraint(model, a >= b, PSDCone())
[x - 1 2 x - 2
2 x - 2 x - 4] ∈ PSDCone()
julia> jump_function(constraint_object(cref))
4-element Vector{AffExpr}:
x - 1
2 x - 2
2 x - 2
x - 4
julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeSquare(2)
The symmetric case:
julia> using LinearAlgebra  # for Symmetric
julia> model = Model();
julia> @variable(model, x);
julia> a = [x 2x; 2x x];
julia> b = [1 2; 2 4];
julia> cref = @constraint(model, Symmetric(a - b) in PSDCone())
[x - 1 2 x - 2
⋯ x - 4] ∈ PSDCone()
julia> jump_function(constraint_object(cref))
3-element Vector{AffExpr}:
x - 1
2 x - 2
x - 4
julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeTriangle(2)
#
JuMP.Parameter
— Type
Parameter(value)
The abbreviation for the set is `MOI.Parameter'.
Example
julia> model = Model();
julia> @variable(model, x in Parameter(2))
x
julia> print(model)
Feasibility
Subject to
x ∈ MathOptInterface.Parameter{Float64}(2.0)
#
JuMP.QuadExpr
— Type
QuadExpr
An alias for GenericQuadExpr{Float64,VariableRef}, the specific variant of GenericQuadExpr used by JuMP.
#
JuMP.QuadTermIterator
— Type
QuadTermIterator{GQE<:GenericQuadExpr}
A struct implementing the iterate protocol for iterating over the (coefficient, variable, variable) tuples of a GenericQuadExpr.
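A short sketch (assumed, not from the original docstring): the iterator is returned by quad_terms.
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = 2.0 * x[1] * x[2] + x[1]^2;
julia> collect(quad_terms(expr))
2-element Vector{Tuple{Float64, VariableRef, VariableRef}}:
 (2.0, x[1], x[2])
 (1.0, x[1], x[1])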
#
JuMP.RotatedSecondOrderCone
— Type
RotatedSecondOrderCone
A rotated second-order cone object that can be used to constrain the square of the Euclidean norm of a vector x to be less than or equal to 2tu, where t and u are non-negative scalars. A shortcut for MOI.RotatedSecondOrderCone.
Example
The following code imposes the constraints (x-1)² + (x-2)² ≤ 2tx and t, x ≥ 0:
julia> model = Model();
julia> @variable(model, x)
x
julia> @variable(model, t)
t
julia> @constraint(model, [t, x, x-1, x-2] in RotatedSecondOrderCone())
[t, x, x - 1, x - 2] ∈ MathOptInterface.RotatedSecondOrderCone(4)
#
JuMP.SOS1
— Type
SOS1(weights = Real[])
The SOS1 set (a Special Ordered Set of Type 1) constrains the vector x to a set in which at most one variable may take a non-zero value, and all other elements are zero.
The weights vector, if specified, induces an ordering of the variables; for this reason, it must contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to the element x[i]. If the weights vector is not specified, it defaults to weights[i] = i.
This is a shortcut for the MOI.SOS1 set.
Example
julia> model = Model();
julia> @variable(model, x[1:3] in SOS1([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
x[1]
x[2]
x[3]
julia> print(model)
Feasibility
Subject to
[x[1], x[2], x[3]] ∈ MathOptInterface.SOS1{Float64}([4.1, 3.2, 5.0])
#
JuMP.SOS2
— Type
SOS2(weights = Real[])
The SOS2 set (a Special Ordered Set of Type 2) constrains the vector x to a set in which at most two variables may take a non-zero value, and all other elements are zero. In addition, the two non-zero values must be consecutive with respect to the ordering of the vector x induced by the weights vector.
The weights vector, if specified, induces an ordering of the variables; for this reason, it must contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to the element x[i]. If the weights vector is not specified, it defaults to weights[i] = i.
This is a shortcut for the MOI.SOS2 set.
Example
julia> model = Model();
julia> @variable(model, x[1:3] in SOS2([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
x[1]
x[2]
x[3]
julia> print(model)
Feasibility
Subject to
[x[1], x[2], x[3]] ∈ MathOptInterface.SOS2{Float64}([4.1, 3.2, 5.0])
#
JuMP.ScalarConstraint
— Type
struct ScalarConstraint
Data for the scalar constraint.
For more information, see also the documentation on representing constraints in JuMP.
Fields
-
.func
: the field contains a JuMP object representing the function.
-
.set
: the field contains an MOI set.
Example
Scalar constraint:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1
julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))
julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}
julia> object.func
2 x
julia> object.set
MathOptInterface.LessThan{Float64}(1.0)
#
JuMP.ScalarShape
— Type
ScalarShape()
An object AbstractShape
representing scalar constraints.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> c = @constraint(model, x[2] <= 1);
julia> shape(constraint_object(c))
ScalarShape()
#
JuMP.ScalarVariable
— Type
ScalarVariable{S,T,U,V} <: AbstractVariable
The structure used when adding variables.
See also the function description add_variable
.
#
JuMP.SecondOrderCone
— Type
SecondOrderCone
A second-order cone object that can be used to constrain the Euclidean norm of a vector x to be less than or equal to a non-negative scalar t. A shortcut for MOI.SecondOrderCone.
Example
The following code imposes the constraints √((x-1)² + (x-2)²) ≤ t and t ≥ 0:
julia> model = Model();
julia> @variable(model, x)
x
julia> @variable(model, t)
t
julia> @constraint(model, [t, x-1, x-2] in SecondOrderCone())
[t, x - 1, x - 2] ∈ MathOptInterface.SecondOrderCone(3)
#
JuMP.Semicontinuous
— Type
Semicontinuous(lower, upper)
The abbreviation for the set is `MOI.Semicontinuous'.
This abbreviation is useful because it automatically promotes lower
and upper
to the same type and converts them to the element type supported by the JuMP model.
Example
julia> model = Model();
julia> @variable(model, x in Semicontinuous(1, 2))
x
julia> print(model)
Feasibility
Subject to
x ∈ MathOptInterface.Semicontinuous{Int64}(1, 2)
#
JuMP.Semiinteger
— Type
Semiinteger(lower, upper)
The abbreviation for the set is `MOI.Semiinteger'.
This abbreviation is useful because it automatically promotes lower
and upper
to the same type and converts them to the element type supported by the JuMP model.
Example
julia> model = Model();
julia> @variable(model, x in Semiinteger(3, 5))
x
julia> print(model)
Feasibility
Subject to
x ∈ MathOptInterface.Semiinteger{Int64}(3, 5)
#
JuMP.SensitivityReport
— Type
SensitivityReport
See the description of the method lp_sensitivity_report
.
#
JuMP.SkewSymmetricMatrixShape
— Type
SkewSymmetricMatrixShape
A shape object for a skew-symmetric square matrix of `side_dimension' rows and columns. The vectorized shape contains the elements of the upper-right triangular part of the matrix (without a diagonal) in columns (or, equivalently, the elements of the lower-left triangular part in rows). The diagonal is zero.
#
JuMP.SkipModelConvertScalarSetWrapper
— Type
SkipModelConvertScalarSetWrapper(set::MOI.AbstractScalarSet)
JuMP uses model_convert to automatically promote MOI.AbstractScalarSet sets to the same value_type as the model.
If this is undesirable, wrap the set in a SkipModelConvertScalarSetWrapper to pass it to the solver in its original form.
This struct is intended for internal use in JuMP extensions; it should not be needed in regular JuMP code.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, x in MOI.EqualTo(1 // 2))
x = 0.5
julia> @constraint(model, x in SkipModelConvertScalarSetWrapper(MOI.EqualTo(1 // 2)))
x = 1//2
#
JuMP.SquareMatrixShape
— Type
SquareMatrixShape
A shape object for a square matrix of side_dimension
rows and columns. The vectorized shape contains the elements of the matrix in columns (or, equivalently, the elements of the lower-left triangular part in rows).
#
JuMP.SymmetricMatrixAdjointShape
— Type
SymmetricMatrixAdjointShape(side_dimension)
The dual_shape
of SymmetricMatrixShape
.
This shape is not intended for regular use.
#
JuMP.SymmetricMatrixShape
— Type
SymmetricMatrixShape
A shape object for a symmetrical square matrix of `side_dimension' rows and columns. The vectorized shape contains the elements of the upper-right triangular part of the matrix in columns (or, equivalently, the elements of the lower-left triangular part in rows).
#
JuMP.SymmetricMatrixSpace
— Type
SymmetricMatrixSpace()
Use in the @variable macro to constrain a matrix of variables to be symmetric.
Example
julia> model = Model();
julia> @variable(model, Q[1:2, 1:2] in SymmetricMatrixSpace())
2×2 LinearAlgebra.Symmetric{VariableRef, Matrix{VariableRef}}:
Q[1,1] Q[1,2]
Q[1,2] Q[2,2]
#
JuMP.UnorderedPair
— Type
UnorderedPair(a::T, b::T)
The wrapper type used by the type GenericQuadExpr
, with fields .a
and .b
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = 2.0 * x[1] * x[2]
2 x[1]*x[2]
julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 1 entry:
UnorderedPair{VariableRef}(x[1], x[2]) => 2.0
#
JuMP.VariableConstrainedOnCreation
— Type
VariableConstrainedOnCreation <: AbstractVariable
The variable scalar_variable, which should belong to the set set.
Adding this variable is equivalent to the following code:
function JuMP.add_variable(
model::GenericModel,
variable::VariableConstrainedOnCreation,
names,
)
var_ref = add_variable(model, variable.scalar_variable, name)
add_constraint(model, VectorConstraint(var_ref, variable.set))
return var_ref
end
however, instead, variables are added by calling MOI.add_constrained_variable(model, variable.set)
.
#
JuMP.VariableInfo
— Type
VariableInfo{S,T,U,V}
The internal JuMP structure used when creating variables. It can also be used by JuMP extensions to create new types of variables.
See also the type description ScalarVariable
.
#
JuMP.VariableNotOwned
— Type
struct VariableNotOwned{V<:AbstractVariableRef} <: Exception
variable::V
end
An error thrown when the variable variable is used in a model other than owner_model(variable).
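A sketch analogous to the ConstraintNotOwned example above (not from the original docstring; the exact error text may vary):
julia> model = Model();
julia> @variable(model, x);
julia> model_new = Model();
julia> MOI.get(model_new, MOI.VariableName(), x)
ERROR: VariableNotOwned{VariableRef}(x)
Stacktrace:
[...]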
#
JuMP.VariablesConstrainedOnCreation
— Type
VariablesConstrainedOnCreation <: AbstractVariable
A vector of variables scalar_variables that should belong to the set set. Adding these variables is equivalent to the following code:
function JuMP.add_variable(
model::GenericModel,
variable::VariablesConstrainedOnCreation,
names,
)
v_names = vectorize(names, variable.shape)
var_refs = add_variable.(model, variable.scalar_variables, v_names)
add_constraint(model, VectorConstraint(var_refs, variable.set))
return reshape_vector(var_refs, variable.shape)
end
however, instead, variables are added by calling MOI.add_constrained_variables(model, variable.set)
. For information about the difference between adding variables using MOI.add_constrained_variables
and adding them using MOI.add_variables
with the addition of constraints, see https://jump.dev/MathOptInterface.jl/v0.9.3/apireference/#Variables-1 [MOI documentation].
#
JuMP.VectorConstraint
— Type
struct VectorConstraint
Data for the vector constraint.
See also the documentation on representing constraints in JuMP.
Fields
-
func
: the field contains a JuMP object representing the function.
-
set
: the field contains an MOI set.
-
shape
: the field contains an AbstractShape corresponding to the shape in which the constraint was created (for example, using matrices or flat vectors).
Example
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)
julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())
julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}
julia> object.func
3-element Vector{VariableRef}:
x[1]
x[2]
x[3]
julia> object.set
MathOptInterface.SecondOrderCone(3)
julia> object.shape
VectorShape()
#
JuMP.VectorShape
— Type
VectorShape()
An object AbstractShape
, representing vector-valued constraints.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> c = @constraint(model, x in SOS1());
julia> shape(constraint_object(c))
VectorShape()
#
JuMP.Zeros
— Type
Zeros()
The JuMP equivalent of the MOI.Zeros set, in which the dimension is inferred from the corresponding function.
Example
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> @constraint(model, x in Zeros())
[x[1], x[2]] ∈ MathOptInterface.Zeros(2)
julia> A = [1 2; 3 4];
julia> b = [5, 6];
julia> @constraint(model, A * x == b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ MathOptInterface.Zeros(2)
#
JuMP._VariableValueMap
— _Type
_VariableValueMap{F}
A lazy cache used when computing the primal variable solution in value.
It avoids the need to rewrite the nonlinear expressions from MOI_VARIABLE to VARIABLE, as well as eagerly computing var_value for every variable. The cache is used so that variables that have already been encountered do not have to be recomputed.
#
Base.copy
— Method
copy(model::AbstractModel)
Returns a copy of the model model. It works like copy_model, except that it does not return the mapping between the references of model and those of its copy.
Note
Model copying is not supported in DIRECT mode, that is, when the model is created using the direct_model constructor rather than the Model constructor. In addition, regardless of whether an optimizer was provided when the model was created, the new model will not have an optimizer attached; one must be provided to the new model in the call to optimize!.
Example
The following example creates a model
with an x
variable and a cref
constraint. It is then copied to the new_model
model with new references assigned to x_new
and `cref_new'.
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, cref, x == 2)
cref : x = 2
julia> new_model = copy(model);
julia> x_new = new_model[:x]
x
julia> cref_new = new_model[:cref]
cref : x = 2
#
Base.empty!
— Method
empty!(model::GenericModel)::GenericModel
Empties the model, that is, removes all variables, constraints, and model attributes, but not optimizer attributes. Always returns the argument.
Note: it removes the extension data.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> isempty(model)
false
julia> empty!(model)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> print(model)
Feasibility
Subject to
julia> isempty(model)
true
#
Base.getindex
— Method
Base.getindex(m::JuMP.AbstractModel, name::Symbol)
Simplifies access to JuMP variables and constraints using the []
syntax.
Returns a variable or group of variables (either a constraint or a group of constraints) with the specified name that were added to the model. If multiple variables or constraints have the same name, an error occurs.
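A minimal sketch (assumed, not from the original docstring):
julia> model = Model();
julia> @variable(model, x);
julia> model[:x]
x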
#
Base.haskey
— Method
haskey(model::AbstractModel, name::Symbol)
Determines whether the model has a mapping for the given name.
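A minimal sketch (assumed, not from the original docstring):
julia> model = Model();
julia> haskey(model, :x)
false
julia> @variable(model, x);
julia> haskey(model, :x)
true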
#
Base.isempty
— Method
isempty(model::GenericModel)
Checks whether the model is empty, that is, whether the MOI backend is empty and whether the model is in the same state as when it was created, except for the attributes of the optimizer.
Example
julia> model = Model();
julia> isempty(model)
true
julia> @variable(model, x[1:2]);
julia> isempty(model)
false
#
Base.read
— Method
Base.read(
io::IO,
::Type{<:GenericModel};
format::MOI.FileFormats.FileFormat,
kwargs...,
)
Returns the JuMP model read from io in the format format.
Other kwargs are passed to the Model constructor of the chosen format.
#
Base.setindex!
— Method
Base.setindex!(m::JuMP.AbstractModel, value, name::Symbol)
Stores the object value in the model m so that it can be accessed via getindex. Can be called using the [] syntax.
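A minimal sketch (assumed, not from the original docstring); the key :my_data is a hypothetical name chosen for illustration:
julia> model = Model();
julia> model[:my_data] = 42
42
julia> model[:my_data]
42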
#
Base.show
— _Method
Base.show([io::IO], summary::SolutionSummary; verbose::Bool = false)
Writes a summary of the solution results to io
(or to stdout' if the `io
argument is not specified).
#
Base.write
— Method
Base.write(
io::IO,
model::GenericModel;
format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_MOF,
kwargs...,
)
Writes the JuMP model model to io in the format format.
Other kwargs are passed to the Model constructor of the chosen format.
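A round-trip sketch (assumed, not from the original docstring) writing a model to an in-memory buffer in MPS format and reading it back:
julia> model = Model();
julia> @variable(model, x >= 0);
julia> io = IOBuffer();
julia> write(io, model; format = MOI.FileFormats.FORMAT_MPS);
julia> seekstart(io);
julia> model_2 = read(io, Model; format = MOI.FileFormats.FORMAT_MPS);
julia> num_variables(model_2)
1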
#
JuMP.BinaryRef
— Method
BinaryRef(v::GenericVariableRef)
Returns a reference to the constraint that makes the variable v
binary. If it does not exist, it returns an error.
See also the description is_binary
, set_binary
and unset_binary
.
Example
julia> model = Model();
julia> @variable(model, x, Bin);
julia> BinaryRef(x)
x binary
#
JuMP.FixRef
— Method
FixRef(v::GenericVariableRef)
Returns a reference to the constraint that fixes the value of the variable `v'.
If it doesn’t exist, it returns an error.
Example
julia> model = Model();
julia> @variable(model, x == 1);
julia> FixRef(x)
x = 1
#
JuMP.IntegerRef
— Method
IntegerRef(v::GenericVariableRef)
Returns a reference to the constraint that makes the variable v
an integer.
If it doesn’t exist, it returns an error.
See also the description is_integer
, set_integer
and unset_integer
.
Example
julia> model = Model();
julia> @variable(model, x, Int);
julia> IntegerRef(x)
x integer
#
JuMP.LowerBoundRef
— Method
LowerBoundRef(v::GenericVariableRef)
Returns a reference to the restriction of the lower bound of the variable `v'.
If it does not exist, it returns an error.
See also the description has_lower_bound
, lower_bound
, set_lower_bound
and delete_lower_bound
.
Example
julia> model = Model();
julia> @variable(model, x >= 1.0);
julia> LowerBoundRef(x)
x ≥ 1
#
JuMP.NLPEvaluator
— Method
NLPEvaluator(
model::Model,
_differentiation_backend::MOI.Nonlinear.AbstractAutomaticDifferentiation =
MOI.Nonlinear.SparseReverseMode(),
)
Returns an MOI.AbstractNLPEvaluator constructed from model.
Before using the evaluator, it must be initialized using MOI.initialize.
Experimental
These features may change or be removed in any future version of JuMP.
Pass _differentiation_backend to specify the differentiation backend used to compute derivatives.
#
JuMP.ParameterRef
— Method
ParameterRef(x::GenericVariableRef)
Returns a reference to the constraint that makes x
a parameter.
If it does not exist, it returns an error.
See also the description is_parameter
, set_parameter_value
and parameter_value
.
Example
julia> model = Model();
julia> @variable(model, p in Parameter(2))
p
julia> ParameterRef(p)
p ∈ MathOptInterface.Parameter{Float64}(2.0)
julia> @variable(model, x);
julia> ParameterRef(x)
ERROR: Variable x is not a parameter.
Stacktrace:
[...]
#
JuMP.UpperBoundRef
— Method
UpperBoundRef(v::GenericVariableRef)
Returns a reference to the restriction of the upper bound of the variable `v'.
If it does not exist, it returns an error.
See also the description has_upper_bound
, upper_bound
, set_upper_bound
and delete_upper_bound
.
Example
julia> model = Model();
julia> @variable(model, x <= 1.0);
julia> UpperBoundRef(x)
x ≤ 1
#
JuMP._compute_rhs_range
— _Method
_compute_rhs_range(d_B, x_B, l_B, u_B, atol)
Assume that we start with the optimal solution x_old and want to compute a step size t in the direction d such that x_new = x_old + t * d is still represented by the same optimal basis. This can be computed à la primal simplex, using an artificial entering variable.
A * x_new = A * (x_old + t * d)
          = A * x_old + t * A * d
          = 0 + t * A * d      # since A * x_old = 0
=> A * d = 0
=> B * d_B + N * d_N = 0
=> d_B = B \ -(N * d_N)
Note that we only need to compute the basic component of the direction vector, because d_N is all zeros, with a 1 in the component associated with the artificial entering variable. Therefore, all that remains is to compute the associated column of N.
If we are increasing the bounds associated with the i-th decision variable, then the artificial entering variable is a duplicate of the i-th variable, and N * d_N = A[:, i].
If we are increasing the bounds associated with the i-th affine constraint, then the artificial entering variable is a duplicate of the slack variable associated with the i-th constraint, that is, a -1 in the i-th row and zeros everywhere else.
In either case:
d_B = -(B \ A[:, i])
We have computed a direction such that x_new = x_old + t * d. By ensuring A * d = 0, we maintain structural feasibility. Now we need to compute bounds on t such that x_new maintains bound feasibility, that is, bounds on t such that:
l_B[j] <= x_B[j] + t * d_B[j] <= u_B[j].
#
JuMP._desparsify
— _Method
_desparsify(x)
If x is an AbstractSparseArray, return the dense equivalent; otherwise, just return x.
This function is used in _build_constraint.
Why is this needed?
When broadcasting f.(x) over an AbstractSparseArray x, Julia first calls the equivalent of f(zero(eltype(x))). Here is an example:
julia> import SparseArrays
julia> foo(x) = (println("Calling $(x)"); x)
foo (generic function with 1 method)
julia> foo.(SparseArrays.sparsevec([1, 2], [1, 2]))
Calling 1
Calling 2
2-element SparseArrays.SparseVector{Int64, Int64} with 2 stored entries:
[1] = 1
[2] = 2
However, if the function f is mutating, this can have serious consequences! In our case, broadcasting build_constraint adds a new 0 = 0 constraint.
Sparse arrays are most often formed when the input data for a constraint is sparse (for example, a constant vector or a matrix). Due to the promotion and arithmetic operations, a constraint function is obtained, which is represented by an array of AbstractSparseArray
, but is actually dense. Therefore, you can safely assemble (collect
) the matrix into a dense array.
If the function itself is sparse, the right behavior is not obvious. What is the "zero" element of the result? What does it mean to broadcast build_constraint over a sparse array, adding scalar constraints? This most likely means that the user is using the wrong data structure. For simplicity, we also collect into a dense array and see whether any problems are reported.
#
JuMP._eval_as_variable
— _Method
_eval_as_variable(f::F, x::GenericAffExpr, args...) where {F}
In many cases, the @variable macro may return a GenericAffExpr instead of a GenericVariableRef. This is particularly the case for complex-valued expressions. So that common operations such as lower_bound(x) work, the method should be forwarded if and only if x can be converted to a GenericVariableRef.
#
JuMP._fill_vaf!
— _Method
_fill_vaf!(
terms::Vector{<:MOI.VectorAffineTerm},
offset::Int,
oi::Int,
aff::AbstractJuMPScalar,
)
Fills the vector terms, starting at index offset+1, with the affine terms of aff. The output index for all terms is oi. Returns the index of the last term added.
#
JuMP._fill_vqf!
— _Method
_fill_vqf!(terms::Vector{<:MOI.VectorQuadraticTerm}, offset::Int, oi::Int,
quad::AbstractJuMPScalar)
Fills the vector terms, starting at index offset+1, with the quadratic terms of quad. The output index for all terms is oi. Returns the index of the last term added.
#
JuMP._finalize_macro
— _Method
_finalize_macro(
model,
code,
source::LineNumberNode;
register_name::Union{Nothing,Symbol} = nothing,
wrap_let::Bool = false,
)
Encloses the code
code generated by the macro in a block of code with the first argument source
, that is, the node LineNumberNode
from which the macro was called in the user code. Improves stack traces in error messages.
In addition, this function checks whether the model' model is a valid `AbstractModel
object.
If `register_name` is a `Symbol`, the result of evaluating `code` is registered in `model` under the name `register_name`.
If `wrap_let == true`, the `code` is wrapped in a `let model = model` block, which makes `model` a local variable.
#
JuMP._moi_quadratic_term
— _Method
_moi_quadratic_term(t::Tuple)
Returns the MOI.ScalarQuadraticTerm for the quadratic term `t`, an element of the iterator returned by quad_terms
. Note that the VariableRef
references are converted to MOI.VariableIndex
indexes, so information about the owner model is lost.
#
JuMP._nlp_objective_function
— _Method
_nlp_objective_function(model::GenericModel)
Returns a non-linear objective function or nothing
if it is not specified.
#
JuMP._parse_nonlinear_expression
— _Method
_parse_nonlinear_expression(model::GenericModel, x::Expr)
JuMP needs to build nonlinear expression objects inside macros. There are two main challenges:
-
Local variables must be spliced into the expression. This is reasonably simple: any symbol that is not a function call is wrapped in `esc(x)`.
-
Unregistered user-defined functions must be identified so that we can attempt to register them automatically if a function of that symbolic name exists in the scope. Automatic registration was originally introduced (@odow) in https://github.com/jump-dev/JuMP.jl/pull/2537 to fix a common issue in JuMP, but in hindsight it was a mistake. One complicating factor is that the parsing of nonlinear expressions has moved from macro-expansion time to runtime. This is a big improvement for the readability of the system, but it means we lose access to the caller's local scope. The best solution for backward compatibility is to check whether each function call is registered before parsing the expression.
#
JuMP._print_latex
— _Method
_print_latex(io::IO, model::AbstractModel)
Prints the LaTeX formulation of the model
to `io`.
For this method to work, the AbstractModel
subtype must implement:
-
objective_function_string
-
constraints_string
-
_nl_subexpression_string
#
JuMP._print_model
— _Method
_print_model(io::IO, model::AbstractModel)
Prints a plain-text formulation of the model
to `io`.
For this method to work, the AbstractModel
subtype must implement:
-
objective_function_string
-
constraints_string
-
_nl_subexpression_string
#
JuMP._print_summary
— _Method
_print_summary(io::IO, model::AbstractModel)
Prints a plain-text summary of the model
to `io`.
For this method to work, the AbstractModel
subtype must implement:
-
name(::AbstractModel)
-
show_objective_function_summary
-
show_constraints_summary
-
show_backend_summary
#
JuMP._replace_zero
— _Method
_replace_zero(model::M, x) where {M<:AbstractModel}
Replaces _MA.Zero
with a floating-point zero(value_type(M))
.
#
JuMP._rewrite_expression
— _Method
_rewrite_expression(expr)
If `expr` is not an `Expr`, there is nothing to rewrite. If it is mutable, it simply needs to be copied so that subsequent operations do not modify the user's data.
#
JuMP._rewrite_expression
— _Method
_rewrite_expression(expr)
A helper function so that the way expressions are rewritten can be defined in one place and applied everywhere in the JuMP macros that rewrite expressions.
#
JuMP._standard_form_matrix
— _Method
_standard_form_matrix(model::GenericModel)
See lp_matrix_data
instead.
#
JuMP.add_bridge
— Method
add_bridge(
model::GenericModel{T},
BT::Type{<:MOI.Bridges.AbstractBridge};
coefficient_type::Type{S} = T,
) where {T,S}
Adds BT{T}
to the list of bridges that can be used to convert unsupported constraints into an equivalent form supported by the optimizer.
See also the description of the method remove_bridge
.
Example
julia> model = Model();
julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)
julia> add_bridge(
model,
MOI.Bridges.Constraint.NumberConversionBridge;
coefficient_type = Complex{Float64}
)
#
JuMP.add_constraint
— Function
add_constraint(
model::GenericModel,
con::AbstractConstraint,
name::String= "",
)
This method should only be implemented by developers who create JuMP extensions. It should never be invoked by JuMP users.
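As an illustration of how an extension developer might use this entry point (this sketch is not part of the original docstring), a constraint can be built with build_constraint and then added explicitly:

model = Model()
@variable(model, x)
con = build_constraint(error, 2x, MOI.LessThan(1.0))   # a ScalarConstraint
c = add_constraint(model, con, "c")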
#
JuMP.add_nonlinear_constraint
— Method
add_nonlinear_constraint(model::Model, expr::Expr)
Adds a nonlinear constraint described by the Julia expression expr
to the `model`.
This function is most useful when the expression expr
is generated programmatically, and using @NLconstraint
is not possible.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section instead.
Notes
-
Variables should be interpolated directly into the expression `expr'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> add_nonlinear_constraint(model, :($(x) + $(x)^2 <= 1))
(x + x ^ 2.0) - 1.0 ≤ 0
#
JuMP.add_nonlinear_expression
— Method
add_nonlinear_expression(model::Model, expr::Expr)
Adds a non-linear expression expr
to `model'.
This function is most useful if the expression expr
is generated programmatically, and using @NLexpression
is not possible.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section instead.
Notes
-
Variables should be interpolated directly into the expression `expr'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> add_nonlinear_expression(model, :($(x) + $(x)^2))
subexpression[1]: x + x ^ 2.0
#
JuMP.add_nonlinear_operator
— Method
add_nonlinear_operator(
model::Model,
dim::Int,
f::Function,
[∇f::Function,]
[∇²f::Function];
[name::Symbol = Symbol(f),]
)
Adds a new nonlinear operator with dim
input arguments to model
and associates it with the name `name'.
The function f
evaluates the operator and must return a scalar value.
The optional function ∇f
calculates the first derivative, and the optional function ∇2f
calculates the second derivative.
The function ∇2f
can only be passed if the function ∇f
is passed.
One-dimensional syntax
With dim == 1
, the method signatures of each function should be as follows:
-
f(::T)::T where {T<:Real}
-
∇f(::T)::T where {T<:Real}
-
∇²f(::T)::T where {T<:Real}
Multidimensional syntax
For dim > 1
, the method signatures of each function should be as follows:
-
f(x::T...)::T where {T<:Real}
-
∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}
-
∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}
where the gradient vector `g` and the Hessian matrix `H`
are filled in-place. For the Hessian, only the non-zero elements of the lower triangle need to be filled in. Setting an off-diagonal element of the upper triangle may result in an error.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::Float64) = x^2
f (generic function with 1 method)
julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)
julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)
julia> op_f = add_nonlinear_operator(model, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :f)
julia> @objective(model, Min, op_f(x))
f(x)
julia> op_f(2.0)
4.0
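A hedged sketch of the multivariate syntax (this example is not part of the original docstring; the names g, ∇g, and ∇²g are illustrative):

g(x...) = x[1]^2 + x[2]^2
function ∇g(grad::AbstractVector, x...)
    grad[1] = 2 * x[1]
    grad[2] = 2 * x[2]
    return
end
function ∇²g(H::AbstractMatrix, x...)
    # Only the lower triangle needs to be filled.
    H[1, 1] = 2.0
    H[2, 2] = 2.0
    return
end
model = Model()
@variable(model, x[1:2])
op_g = add_nonlinear_operator(model, 2, g, ∇g, ∇²g; name = :op_g)
@objective(model, Min, op_g(x[1], x[2]))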
#
JuMP.add_nonlinear_parameter
— Method
add_nonlinear_parameter(model::Model, value::Real)
Adds an anonymous parameter to the model.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section instead.
#
JuMP.add_to_expression!
— Function
add_to_expression!(expression, terms...)
Updates the expression expression
in place to expression+(*)(terms...)
.
This is usually much more efficient than expression += (*)(terms...)
, because it avoids allocating a temporary for the right-hand side term.
For example, add_to_expression!(expression, a, b)
gives the same result as expression += a*b
, and add_to_expression!(expression, a)
gives the same result as expression += a.
Implementation notes
Only a few methods have been defined, mainly for internal use and only for the following cases:
-
The methods can be effectively implemented.
-
expression
can store the result. For example, the method
add_to_expression!(::AffExpr, ::GenericVariableRef, ::GenericVariableRef)
is not defined because GenericAffExpr
cannot store the product of two variables.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> expr = 2 + x
x + 2
julia> add_to_expression!(expr, 3, x)
4 x + 2
julia> expr
4 x + 2
#
JuMP.add_to_function_constant
— Method
add_to_function_constant(constraint::ConstraintRef, value)
Adds value
to the constant term of the constraint
function.
Note that for scalar constraints, JuMP moves all constant terms onto the set side of the constraint, so instead of modifying the function, the set is translated by -value
. For example, the constraint 2x <= 3
will be changed by the method add_to_function_constant(c, 4)
to 2x <= -1
.
Example
For scalar constraints, the set is translated by -value
:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, con, 0 <= 2x - 1 <= 2)
con : 2 x ∈ [1, 3]
julia> add_to_function_constant(con, 4)
julia> con
con : 2 x ∈ [-3, -1]
For vector constraints, a constant is added to the function:
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> @constraint(model, con, [x + y, x, y] in SecondOrderCone())
con : [x + y, x, y] ∈ MathOptInterface.SecondOrderCone(3)
julia> add_to_function_constant(con, [1, 2, 2])
julia> con
con : [x + y + 1, x + 2, y + 2] ∈ MathOptInterface.SecondOrderCone(3)
#
JuMP.add_variable
— Function
add_variable(m::GenericModel, v::AbstractVariable, name::String = "")
This method should only be implemented by developers who create JuMP extensions. It should never be invoked by JuMP users.
#
JuMP.all_constraints
— Method
all_constraints(model::GenericModel, function_type, set_type)::Vector{<:ConstraintRef}
Returns a list of all constraints that currently exist in the model, where the function is of type function_type
and the set is of type `set_type`. The constraints are ordered by creation time.
See also the description of the methods list_of_constraint_types
and num_constraints
.
Example
julia> model = Model();
julia> @variable(model, x >= 0, Bin);
julia> @constraint(model, 2x <= 1);
julia> all_constraints(model, VariableRef, MOI.GreaterThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}:
x ≥ 0
julia> all_constraints(model, VariableRef, MOI.ZeroOne)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.ZeroOne}, ScalarShape}}:
x binary
julia> all_constraints(model, AffExpr, MOI.LessThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
2 x ≤ 1
#
JuMP.all_constraints
— Method
all_constraints(
model::GenericModel;
include_variable_in_set_constraints::Bool,
)::Vector{ConstraintRef}
Retrieves a list of all constraints in the model
.
When include_variable_in_set_constraints == true
, VariableRef
constraints are included, such as `VariableRef`-in-`Integer`. To return only the structural constraints (for example, the rows of a linear program's constraint matrix), pass `include_variable_in_set_constraints = false`.
Example
julia> model = Model();
julia> @variable(model, x >= 0, Int);
julia> @constraint(model, 2x <= 1);
julia> @NLconstraint(model, x^2 <= 1);
julia> all_constraints(model; include_variable_in_set_constraints = true)
4-element Vector{ConstraintRef}:
2 x ≤ 1
x ≥ 0
x integer
x ^ 2.0 - 1.0 ≤ 0
julia> all_constraints(model; include_variable_in_set_constraints = false)
2-element Vector{ConstraintRef}:
2 x ≤ 1
x ^ 2.0 - 1.0 ≤ 0
Performance Notes
Note that this function is type-unstable because it returns a vector of an abstract type. If performance is a problem, consider using list_of_constraint_types
and a function barrier. For more information, see the Performance tips for extensions section of the documentation.
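A hedged sketch (illustrative only) of the function-barrier pattern mentioned above: the inner helper receives a concretely typed vector, so the loop body is type-stable. The helper name _count is hypothetical.

function summarize_constraints(model)
    n = 0
    for (F, S) in list_of_constraint_types(model)
        n += _count(all_constraints(model, F, S))   # function barrier
    end
    return n
end

_count(cons::Vector{<:ConstraintRef}) = length(cons)   # type-stable inner work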
#
JuMP.all_nonlinear_constraints
— Method
all_nonlinear_constraints(model::GenericModel)
Returns a vector of all references to nonlinear constraints in the model in the order in which they were added to the model.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section instead.
This function returns only the constraints that were added using @NLconstraint
and add_nonlinear_constraint
. It does not return GenericNonlinearExpr constraints
.
#
JuMP.all_variables
— Method
all_variables(model::GenericModel{T})::Vector{GenericVariableRef{T}} where {T}
Retrieves a list of all variables currently available in the model. The variables are ordered by creation time.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> all_variables(model)
2-element Vector{VariableRef}:
x
y
#
JuMP.anonymous_name
— Method
anonymous_name(::MIME, x::AbstractVariableRef)
The name used for the anonymous variable x
in the output.
Example
julia> model = Model();
julia> x = @variable(model);
julia> anonymous_name(MIME("text/plain"), x)
"_[1]"
#
JuMP.backend
— Method
backend(model::GenericModel)
Returns the low-level MathOptInterface model on which the JuMP model is based. This model depends on the mode of operation of JuMP (see the description of the mode
method).
-
If JuMP is running in
DIRECT
mode (that is, the model was created using direct_model
), the backend will be the optimizer passed to direct_model
. -
If JuMP is running in
MANUAL
orAUTOMATIC
mode, the backend is `MOI.Utilities.CachingOptimizer'.
To get the index of a variable or constraint in the backend model, use the method index
.
This function should only be used by advanced users who need access to the low-level capabilities of MathOptInterface or the solver.
Notes
If JuMP is not in DIRECT
mode, the type returned by backend
may change between JuMP releases. Therefore, use only the public API provided by MathOptInterface, and do not access internal fields. If you need access to the underlying optimizer, see unsafe_backend
instead. Alternatively, use the method direct_model
to create a JuMP model in DIRECT mode.
See also the description of the method unsafe_backend
.
Example
julia> import HiGHS
julia> model = direct_model(HiGHS.Optimizer());
julia> set_silent(model)
julia> @variable(model, x >= 0)
x
julia> highs = backend(model)
A HiGHS model with 1 columns and 0 rows.
julia> index(x)
MOI.VariableIndex(1)
#
JuMP.barrier_iterations
— Method
barrier_iterations(model::GenericModel)
If available, returns the cumulative number of barrier iterations during the last optimization (the MOI.BarrierIterations
attribute).
If this attribute is not implemented by the solver, this function throws an `MOI.GetAttributeNotAllowed` error.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> optimize!(model)
julia> barrier_iterations(model)
0
#
JuMP.bridge_constraints
— Method
bridge_constraints(model::GenericModel)
In direct mode, it returns the value `false'.
In manual or automatic mode, it returns a value of type Bool
, indicating whether the optimizer is set and whether unsupported constraints are automatically converted to equivalent supported constraints, if such a conversion is available.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> bridge_constraints(model)
true
julia> model = Model(Ipopt.Optimizer; add_bridges = false);
julia> bridge_constraints(model)
false
#
JuMP.build_constraint
— Method
build_constraint(error_fn::Function, func, set, args...; kwargs...)
This method should only be implemented by developers who create JuMP extensions. It should never be invoked by JuMP users.
#
JuMP.build_variable
— Method
build_variable(
error_fn::Function,
info::VariableInfo,
args...;
kwargs...,
)
Returns a new object AbstractVariable
.
This method should only be implemented by developers who create JuMP extensions. It should never be invoked by JuMP users.
Arguments
-
error_fn: a function to call instead of error. error_fn annotates the error message with additional information for the user.
-
info: an instance of VariableInfo. It has a number of fields related to the variable, such as info.lower_bound and info.binary.
-
args: optional additional positional arguments from the expansion of the @variable macro.
-
kwargs: optional keyword arguments from the expansion of the @variable macro.
See also the description of the macro @variable
.
Extensions should define a method with ONE positional argument for dispatching a call to another method. If the extension has several positional arguments, then |
Example
@variable(model, x, Foo)
will call
build_variable(error_fn::Function, info::VariableInfo, ::Type{Foo})
Special positional arguments such as Bin
, Int
, and PSD
can be passed along with keyword arguments:
@variable(model, x, Int, Foo(), mykwarg = true)
# or
@variable(model, x, Foo(), Int, mykwarg = true)
will call
build_variable(error_fn::Function, info::VariableInfo, ::Foo; mykwarg)
and info.integer
will have the value true.
Note that the order of the positional arguments does not matter.
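A hedged sketch (the `Foo` type and this method body are illustrative, not part of JuMP) of how an extension might intercept `@variable(model, x, Foo)`:

struct Foo end

function JuMP.build_variable(error_fn::Function, info::JuMP.VariableInfo, ::Type{Foo})
    info.binary && error_fn("a `Foo` variable cannot be binary")
    return JuMP.ScalarVariable(info)   # fall back to an ordinary scalar variable
end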
#
JuMP.callback_node_status
— Method
callback_node_status(cb_data, model::GenericModel)
Returns an MOI.CallbackNodeStatusCode
enum indicating whether the current primal solution available from callback_value
is integer-feasible.
Example
julia> import GLPK
julia> model = Model(GLPK.Optimizer);
julia> @variable(model, x <= 10, Int);
julia> @objective(model, Max, x);
julia> function my_callback_function(cb_data)
status = callback_node_status(cb_data, model)
println("Status is: ", status)
return
end
my_callback_function (generic function with 1 method)
julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)
julia> optimize!(model)
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_INTEGER
Status is: CALLBACK_NODE_STATUS_INTEGER
#
JuMP.callback_value
— Method
callback_value(cb_data, x::GenericVariableRef)
callback_value(cb_data, x::Union{GenericAffExpr,GenericQuadExpr})
Returns the primal solution of x
inside the callback.
`cb_data` is the argument to the callback function, and its type depends on the solver.
To check the availability of the solution, use callback_node_status
.
Example
julia> import GLPK
julia> model = Model(GLPK.Optimizer);
julia> @variable(model, x <= 10, Int);
julia> @objective(model, Max, x);
julia> function my_callback_function(cb_data)
status = callback_node_status(cb_data, model)
if status == MOI.CALLBACK_NODE_STATUS_INTEGER
println("Solution is: ", callback_value(cb_data, x))
end
return
end
my_callback_function (generic function with 1 method)
julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)
julia> optimize!(model)
Solution is: 10.0
Solution is: 10.0
#
JuMP.check_belongs_to_model
— Function
check_belongs_to_model(x::AbstractJuMPScalar, model::AbstractModel)
check_belongs_to_model(x::AbstractConstraint, model::AbstractModel)
Throws an exception VariableNotOwned
if owner_model(x)
is not model
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> check_belongs_to_model(x, model)
julia> model_2 = Model();
julia> check_belongs_to_model(x, model_2)
ERROR: VariableNotOwned{VariableRef}(x): the variable x cannot be used in this model because
it belongs to a different model.
[...]
#
JuMP.check_belongs_to_model
— Method
check_belongs_to_model(con_ref::ConstraintRef, model::AbstractModel)
Throws a `ConstraintNotOwned` exception if owner_model(con_ref)
is not model
.
#
JuMP.coefficient
— Method
coefficient(v1::GenericVariableRef{T}, v2::GenericVariableRef{T}) where {T}
Returns one(T)
if v1 == v2
, and zero(T)
otherwise.
This is a fallback for the other coefficient
methods, simplifying code in which the expression may be a single variable.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> coefficient(x[1], x[1])
1.0
julia> coefficient(x[1], x[2])
0.0
#
JuMP.coefficient
— Method
coefficient(a::GenericAffExpr{C,V}, v::V) where {C,V}
Returns the coefficient associated with the variable v
in the affine expression `a'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> expr = 2.0 * x + 1.0;
julia> coefficient(expr, x)
2.0
#
JuMP.coefficient
— Method
coefficient(a::GenericQuadExpr{C,V}, v1::V, v2::V) where {C,V}
Returns the coefficient associated with the term 'v1 * v2` in the quadratic expression `a'.
Note that `coefficient(a, v1, v2)` is equivalent to `coefficient(a, v2, v1)`.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = 2.0 * x[1] * x[2];
julia> coefficient(expr, x[1], x[2])
2.0
julia> coefficient(expr, x[2], x[1])
2.0
julia> coefficient(expr, x[1], x[1])
0.0
#
JuMP.coefficient
— Method
coefficient(a::GenericQuadExpr{C,V}, v::V) where {C,V}
Returns the coefficient associated with the variable v
in the affine component of `a`.
Example
julia> model = Model();
julia> @variable(model, x);
julia> expr = 2.0 * x^2 + 3.0 * x;
julia> coefficient(expr, x)
3.0
#
JuMP.compute_conflict!
— Method
compute_conflict!(model::GenericModel)
Computes a conflict if the model is infeasible.
The conflict is also called an irreducible invalid subsystem (IIS).
If the optimizer has not been set yet (see the description of the method set_optimizer
), a NoOptimizer
error is thrown.
The status of the conflict can be checked with the model attribute MOI.ConflictStatus. Then the conflict status of each constraint can be queried with the MOI.ConstraintConflictStatus
attribute.
See also the description of the method copy_conflict
.
Example
julia> using JuMP
julia> model = Model(Gurobi.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 0);
julia> @constraint(model, c1, x >= 2);
julia> @constraint(model, c2, x <= 1);
julia> optimize!(model)
julia> compute_conflict!(model)
julia> get_attribute(model, MOI.ConflictStatus())
CONFLICT_FOUND::ConflictStatusCode = 3
#
JuMP.constant
— Method
constant(aff::GenericAffExpr{C,V})::C
Returns the constant of an affine expression.
Example
julia> model = Model();
julia> @variable(model, x);
julia> aff = 2.0 * x + 3.0;
julia> constant(aff)
3.0
#
JuMP.constant
— Method
constant(quad::GenericQuadExpr{C,V})::C
Returns the constant of the quadratic expression.
Example
julia> model = Model();
julia> @variable(model, x);
julia> quad = 2.0 * x^2 + 3.0;
julia> constant(quad)
3.0
#
JuMP.constraint_by_name
— Function
constraint_by_name(model::AbstractModel, name::String, [F, S])::Union{ConstraintRef,Nothing}
Returns a reference to the constraint with the name attribute name
, or nothing
if no constraint has that name attribute.
Throws an error if several constraints have `name` as their name attribute.
If the arguments F
and S
are specified, this method also throws an error if the constraint is not of type `F`-in-`S`, where F
is a JuMP or MOI function type, and S
is an MOI set type.
It is recommended to specify F
and S
if the function and set types are known, since the return type can then be inferred, whereas for the method above (that is, without F
and S
) the exact type of the returned constraint reference cannot be inferred.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, con, x^2 == 1)
con : x² = 1
julia> constraint_by_name(model, "kon")
julia> constraint_by_name(model, "con")
con : x² = 1
julia> constraint_by_name(model, "con", AffExpr, MOI.EqualTo{Float64})
julia> constraint_by_name(model, "con", QuadExpr, MOI.EqualTo{Float64})
con : x² = 1
#
JuMP.constraint_object
— Function
constraint_object(con_ref::ConstraintRef)
Returns the underlying constraint data for the constraint referenced by `con_ref`.
Example
Scalar constraint:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1
julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))
julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}
julia> object.func
2 x
julia> object.set
MathOptInterface.LessThan{Float64}(1.0)
Vector constraint:
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)
julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())
julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}
julia> object.func
3-element Vector{VariableRef}:
x[1]
x[2]
x[3]
julia> object.set
MathOptInterface.SecondOrderCone(3)
#
JuMP.constraint_ref_with_index
— Method
constraint_ref_with_index(model::AbstractModel, index::MOI.ConstraintIndex)
Returns the ConstraintRef
reference of the `model' corresponding to the `index'.
This is an auxiliary function used inside JuMP and some JuMP extensions. There should be no need to call it in user code.
#
JuMP.constraint_string
— Method
constraint_string(
mode::MIME,
ref::ConstraintRef;
in_math_mode::Bool = false,
)
Returns the string representation of the ref
constraint for the `mode' mode.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2 * x <= 1);
julia> constraint_string(MIME("text/plain"), c)
"c : 2 x ≤ 1"
#
JuMP.constraints_string
— Method
constraints_string(mode, model::AbstractModel)::Vector{String}
Returns a list of String
strings describing each constraint of the model.
Example
julia> model = Model();
julia> @variable(model, x >= 0);
julia> @constraint(model, c, 2 * x <= 1);
julia> constraints_string(MIME("text/plain"), model)
2-element Vector{String}:
"c : 2 x ≤ 1"
"x ≥ 0"
#
JuMP.copy_conflict
— Method
copy_conflict(model::GenericModel)
Returns a copy of the current conflict for the model
and GenericReferenceMap
, which can be used to map a variable or constraint reference of the model `model` to the corresponding reference in the new model.
This is a convenience function that provides a filtering function for copy_model
.
Note
Model copying is not supported in DIRECT
mode, that is, when the model was created with the constructor direct_model
rather than the constructor Model
. In addition, regardless of whether an optimizer was provided when the model was created, the new model will not have an optimizer attached; one must be provided in the call to optimize!
.
Example
The following example creates a model
with the variable x
and two constraints: c1
and c2'. There is no solution to this model, as the constraints are mutually exclusive. The solver is instructed to calculate the conflict using `compute_conflict!
. Then the parts of the model
involved in the conflict are copied to the iis_model
model.
julia> using JuMP
julia> import Gurobi
julia> model = Model(Gurobi.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 0)
x
julia> @constraint(model, c1, x >= 2)
c1 : x ≥ 2
julia> @constraint(model, c2, x <= 1)
c2 : x ≤ 1
julia> optimize!(model)
julia> compute_conflict!(model)
julia> if get_attribute(model, MOI.ConflictStatus()) == MOI.CONFLICT_FOUND
iis_model, reference_map = copy_conflict(model)
print(iis_model)
end
Feasibility
Subject to
c1 : x ≥ 2
c2 : x ≤ 1
#
JuMP.copy_extension_data
— Method
copy_extension_data(data, new_model::AbstractModel, model::AbstractModel)
Returns a copy of the extension data data
of the model `model`, to be used as the extension data of the new model `new_model`.
A method should be added for any JuMP extension that stores data in the ext
field.
This method should only be implemented by developers who create JuMP extensions. It should never be invoked by JuMP users.
Do not engage in "type piracy" by implementing this method for types of `data` that you do not own.
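A hedged sketch (the MyExtData type is illustrative, not part of JuMP) of a method an extension might add for data it stores in `model.ext`:

struct MyExtData
    names::Vector{String}
end

function JuMP.copy_extension_data(data::MyExtData, ::AbstractModel, ::AbstractModel)
    # Return the data to store in the new model's `ext` field.
    return MyExtData(copy(data.names))
end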
#
JuMP.copy_model
— Method
copy_model(model::GenericModel; filter_constraints::Union{Nothing, Function}=nothing)
Returns a copy of the model
and GenericReferenceMap
, which can be used to map a variable or constraint reference of the model model
to the corresponding reference in the new model. A method Base.copy(::AbstractModel)
is also implemented; it is similar to copy_model
, but it does not return the reference map.
If the filter_constraints
argument is specified, only the constraints for which this function returns true
are copied. The constraint reference is passed to the function as its argument.
Note
Model copying is not supported in DIRECT
mode, that is, when the model was created with the constructor direct_model
rather than the constructor Model
. In addition, regardless of whether an optimizer was provided when the model was created, the new model will not have an optimizer attached; one must be provided in the call to optimize!
.
Example
The following example creates a model
with an x
variable and a cref
constraint. It is then copied to the new_model
model with new references assigned to x_new
and `cref_new'.
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, cref, x == 2)
cref : x = 2
julia> new_model, reference_map = copy_model(model);
julia> x_new = reference_map[x]
x
julia> cref_new = reference_map[cref]
cref : x = 2
#
JuMP.delete
— Method
delete(model::GenericModel, con_ref::ConstraintRef)
Removes the constraint associated with `constraint_ref' from the `model'.
Note that delete
does not cancel the registration of the name in the model, so adding a new constraint with the same name will result in an error. To cancel the registration of a deleted name, use the method unregister
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1
julia> delete(model, c)
julia> unregister(model, :c)
julia> print(model)
Feasibility
Subject to
julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
#
JuMP.delete
— Method
delete(model::GenericModel, variable_ref::GenericVariableRef)
Removes the variable associated with variable_ref
from the `model'.
Note that delete
does not cancel the registration of the name in the model, so adding a new variable with the same name will result in an error. To cancel the registration of a deleted name, use the method unregister
.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> delete(model, x)
julia> unregister(model, :x)
julia> print(model)
Feasibility
Subject to
julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
#
JuMP.delete
— Method
delete(model::GenericModel, con_refs::Vector{<:ConstraintRef})
Removes constraints related to `con_refs' from the `model'.
Specialized methods may be implemented by solvers to delete multiple constraints of the same concrete type. These methods may be more efficient than calling the `delete` method repeatedly for individual constraints.
See also the description of the method unregister
.
Example
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> @constraint(model, c, 2 * x .<= 1)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
c : 2 x[1] ≤ 1
c : 2 x[2] ≤ 1
c : 2 x[3] ≤ 1
julia> delete(model, c)
julia> unregister(model, :c)
julia> print(model)
Feasibility
Subject to
julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
#
JuMP.delete
— Method
delete(model::GenericModel, variable_refs::Vector{<:GenericVariableRef})
Removes variables related to `variable_refs' from the `model'. Solvers can implement methods for deleting multiple variables, which work more efficiently than repeatedly calling the deletion method for individual variables.
See also the description of the method unregister
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> delete(model, x)
julia> unregister(model, :x)
julia> print(model)
Feasibility
Subject to
julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
#
JuMP.delete_lower_bound
— Method
delete_lower_bound(v::GenericVariableRef)
Deletes the lower bound constraint of the variable.
See also the description LowerBoundRef
, has_lower_bound
, lower_bound
and set_lower_bound
.
Example
julia> model = Model();
julia> @variable(model, x >= 1.0);
julia> has_lower_bound(x)
true
julia> delete_lower_bound(x)
julia> has_lower_bound(x)
false
#
JuMP.delete_upper_bound
— Method
delete_upper_bound(v::GenericVariableRef)
Deletes the upper bound constraint of the variable.
Errors if one does not exist.
See also the description UpperBoundRef
, has_upper_bound
, upper_bound
and set_upper_bound
.
Example
julia> model = Model();
julia> @variable(model, x <= 1.0);
julia> has_upper_bound(x)
true
julia> delete_upper_bound(x)
julia> has_upper_bound(x)
false
#
JuMP.direct_generic_model
— Method
direct_generic_model(
value_type::Type{T},
backend::MOI.ModelLike;
) where {T<:Real}
Returns a new JuMP model, using backend
to store the model and to solve it.
Unlike the Model
constructor, no cache of the model is stored outside of `backend`, and no bridges are applied to `backend` automatically.
Notes
The lack of a cache reduces the amount of memory used, but it is important to keep in mind the following consequences of creating models in this direct mode.
-
If the
backend
does not support an operation, such as modifying constraints or adding variables or constraints after solving, an error is thrown. For models created using the constructor Model
, in such situations you can save the changes in a cache and load them into the optimizer when you call optimize!
. -
Constraint bridges are not supported by default.
-
The optimizer used cannot be changed after creating the model.
-
The created model cannot be copied.
#
JuMP.direct_generic_model
— Method
direct_generic_model(::Type{T}, factory::MOI.OptimizerWithAttributes)
Creates a direct_generic_model
model using `factory`, a `MOI.OptimizerWithAttributes` object created using optimizer_with_attributes
.
Example
julia> import HiGHS
julia> optimizer = optimizer_with_attributes(
HiGHS.Optimizer,
"presolve" => "off",
MOI.Silent() => true,
);
julia> model = direct_generic_model(Float64, optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: DIRECT
Solver name: HiGHS
equivalent to the following:
julia> import HiGHS
julia> model = direct_generic_model(Float64, HiGHS.Optimizer())
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: DIRECT
Solver name: HiGHS
julia> set_attribute(model, "presolve", "off")
julia> set_attribute(model, MOI.Silent(), true)
#
JuMP.direct_model
— Method
direct_model(backend::MOI.ModelLike)
Returns a new JuMP model, using backend
to store the model and to solve it.
Unlike the Model
constructor, no cache of the model is stored outside of `backend`, and no bridges are applied to `backend` automatically.
Notes
The lack of a cache reduces the amount of memory used, but it is important to keep in mind the following consequences of creating models in this direct mode.
-
If the
backend
does not support an operation, such as modifying constraints or adding variables or constraints after solving, an error is thrown. For models created using the constructor Model
, in such situations you can save the changes in a cache and load them into the optimizer when you call optimize!
. -
Constraint bridges are not supported by default.
-
The optimizer used cannot be changed after creating the model.
-
The created model cannot be copied.
#
JuMP.direct_model
— Method
direct_model(factory::MOI.OptimizerWithAttributes)
Creates a direct_model
model using `factory`, a MOI.OptimizerWithAttributes
object created using optimizer_with_attributes
.
Example
julia> import HiGHS
julia> optimizer = optimizer_with_attributes(
HiGHS.Optimizer,
"presolve" => "off",
MOI.Silent() => true,
);
julia> model = direct_model(optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: DIRECT
Solver name: HiGHS
equivalent to the following:
julia> import HiGHS
julia> model = direct_model(HiGHS.Optimizer())
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: DIRECT
Solver name: HiGHS
julia> set_attribute(model, "presolve", "off")
julia> set_attribute(model, MOI.Silent(), true)
#
JuMP.drop_zeros!
— Method
drop_zeros!(expr::GenericAffExpr)
Removes terms with coefficients of 0
from an affine expression.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = x[1] + x[2];
julia> add_to_expression!(expr, -1.0, x[1])
0 x[1] + x[2]
julia> drop_zeros!(expr)
julia> expr
x[2]
#
JuMP.drop_zeros!
— Method
drop_zeros!(expr::GenericQuadExpr)
Removes terms with coefficients of 0
from a quadratic expression.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> expr = x[1]^2 + x[2]^2;
julia> add_to_expression!(expr, -1.0, x[1], x[1])
0 x[1]² + x[2]²
julia> drop_zeros!(expr)
julia> expr
x[2]²
#
JuMP.dual
— Method
dual(con_ref::ConstraintRef; result::Int = 1)
Returns the dual value of the con_ref
constraint associated with the index of the result
of the most recent solution returned by the solver.
To check if the result exists before requesting the values, use the method has_duals
.
See also the description of the methods result_count
and shadow_price
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x);
julia> @constraint(model, c, x <= 1)
c : x ≤ 1
julia> @objective(model, Max, 2 * x + 1);
julia> optimize!(model)
julia> has_duals(model)
true
julia> dual(c)
-2.0
#
JuMP.dual_objective_value
— Method
dual_objective_value(model::GenericModel; result::Int = 1)
Returns the objective value of the dual problem associated with the result index `result` of the most recent solution returned by the solver.
If the solver does not support this attribute, the exception MOI.UnsupportedAttribute{MOI.DualObjectiveValue}
is thrown.
This function is equivalent to querying the `MOI.DualObjectiveValue` attribute.
See also the description of the method result_count
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1);
julia> @objective(model, Min, 2 * x + 1);
julia> optimize!(model)
julia> dual_objective_value(model)
3.0
julia> dual_objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.DualObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
#
JuMP.dual_shape
— Method
dual_shape(shape::AbstractShape)::AbstractShape
Returns the shape of the dual space of the shape `shape`. By default, the dual_shape
of a shape is the shape itself. See the examples section below for a case where this does not hold.
Example
Consider the polynomial constraints for which the moment constraints are dual, and the moment constraints for which the polynomial constraints are dual. The shapes of the polynomials can be defined as follows:
struct Polynomial
coefficients::Vector{Float64}
monomials::Vector{Monomial}
end
struct PolynomialShape <: AbstractShape
monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::PolynomialShape) = Polynomial(x, shape.monomials)
And the shape of the moments can be defined as follows:
struct Moments
coefficients::Vector{Float64}
monomials::Vector{Monomial}
end
struct MomentsShape <: AbstractShape
monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::MomentsShape) = Moments(x, shape.monomials)
Then dual_shape
allows the shape of the dual of polynomial constraints and moment constraints to be defined:
dual_shape(shape::PolynomialShape) = MomentsShape(shape.monomials)
dual_shape(shape::MomentsShape) = PolynomialShape(shape.monomials)
#
JuMP.dual_start_value
— Method
dual_start_value(con_ref::ConstraintRef)
Returns the dual start value (the MOI attribute `ConstraintDualStart`) of the constraint `con_ref`.
If no dual start value has been set, `dual_start_value` returns `nothing`.
See also the description of the method set_dual_start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 2.0);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> set_dual_start_value(c, [0.0])
julia> dual_start_value(c)
1-element Vector{Float64}:
0.0
julia> set_dual_start_value(c, nothing)
julia> dual_start_value(c)
#
JuMP.dual_status
— Method
dual_status(model::GenericModel; result::Int = 1)
Returns the object MOI.ResultStatusCode
, describing the state of the last dual solution of the solver (that is, the attribute MOI.DualStatus
) associated with the index of the result `result'.
See also the description of the method result_count
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> dual_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
#
JuMP.error_if_direct_mode
— Method
error_if_direct_mode(model::GenericModel, func::Symbol)
Throws an error if model
is in direct mode, quoting the name of the calling function `func`.
Used internally within JuMP, or by JuMP extensions that do not support models in direct mode.
Example
julia> import HiGHS
julia> model = direct_model(HiGHS.Optimizer());
julia> error_if_direct_mode(model, :foo)
ERROR: The `foo` function is not supported in DIRECT mode.
Stacktrace:
[...]
#
JuMP.fix
— Method
fix(v::GenericVariableRef, value::Number; force::Bool = false)
Fixes a variable to a value. The fixing constraint is updated if one already exists; otherwise a new one is created.
If the variable already has variable bounds and force=false
, calling fix
will result in an error. If `force=true`, the existing variable bounds will be deleted and the fixing constraint will be added. Note that after a call to unfix
the variable will have no bounds.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_fixed(x)
false
julia> fix(x, 1.0)
julia> is_fixed(x)
true
julia> model = Model();
julia> @variable(model, 0 <= x <= 1);
julia> is_fixed(x)
false
julia> fix(x, 1.0; force = true)
julia> is_fixed(x)
true
#
JuMP.fix_discrete_variables
— Method
fix_discrete_variables([var_value::Function = value,] model::GenericModel)
Modifies the `model` so that all binary and integer variables are converted to continuous variables with fixed bounds given by `var_value(x)`.
Return value
Returns a function that can be called without arguments to restore the original model. If additional changes are made to the affected variables, the behavior of this function is undefined.
Notes
-
If there are semi-continuous or semi-integer constraints, an error occurs (support for such constraints may be added in the future).
-
All other constraints are ignored (left unchanged). This includes discrete constraints like SOS and indicator constraints.
Example
julia> model = Model();
julia> @variable(model, x, Bin, start = 1);
julia> @variable(model, 1 <= y <= 10, Int, start = 2);
julia> @objective(model, Min, x + y);
julia> undo_relax = fix_discrete_variables(start_value, model);
julia> print(model)
Min x + y
Subject to
x = 1
y = 2
julia> undo_relax()
julia> print(model)
Min x + y
Subject to
y ≥ 1
y ≤ 10
y integer
x binary
#
JuMP.flatten!
— Method
flatten!(expr::GenericNonlinearExpr)
Removes the hierarchical structure of a non-linear expression in place by lifting the nested nodes +
and *
into a single n-ary operation.
Motivation
Nonlinear expressions created via operator overloading can be deeply nested and unbalanced. For example, `prod(x for i in 1:4)` creates an expression `*(x, *(x, *(x, x)))`
instead of the more preferable `*(x, x, x, x)`.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> y = prod(x for i in 1:4)
((x²) * x) * x
julia> flatten!(y)
(x²) * x * x
julia> flatten!(sin(prod(x for i in 1:4)))
sin((x²) * x * x)
#
JuMP.function_string
— Method
function_string(
mode::MIME,
func::Union{JuMP.AbstractJuMPScalar,Vector{<:JuMP.AbstractJuMPScalar}},
)
Returns the string String
representing the function func
using the output mode `mode'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> function_string(MIME("text/plain"), 2 * x + 1)
"2 x + 1"
#
JuMP.get_attribute
— Method
get_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute)
get_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute)
get_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute)
Returns the value of the attr
attribute related to the solver.
Equivalent to calling MOI.get
with the appropriate MOI model and, for variables and constraints, with the appropriate index MOI.VariableIndex
or MOI.ConstraintIndex.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1
julia> get_attribute(model, MOI.Name())
""
julia> get_attribute(x, MOI.VariableName())
"x"
julia> get_attribute(c, MOI.ConstraintName())
"c"
#
JuMP.get_attribute
— Method
get_attribute(
model::Union{GenericModel,MOI.OptimizerWithAttributes},
attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)
Returns the value of the attr
attribute related to the solver.
Equivalent to calling MOI.get
with the corresponding MOI model.
If attr
is an AbstractString
string, it is converted to MOI.RawOptimizerAttribute
.
Example
julia> import HiGHS
julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => true);
julia> model = Model(opt);
julia> get_attribute(model, "output_flag")
true
julia> get_attribute(model, MOI.RawOptimizerAttribute("output_flag"))
true
julia> get_attribute(opt, "output_flag")
true
julia> get_attribute(opt, MOI.RawOptimizerAttribute("output_flag"))
true
#
JuMP.get_optimizer_attribute
— Method
get_optimizer_attribute(
model::Union{GenericModel,MOI.OptimizerWithAttributes},
attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)
Returns the value associated with the solver-related attribute `attr'.
If attr
is the string AbstractString
, this method is equivalent to calling get_optimizer_attribute(model, MOI.RawOptimizerAttribute(name))
.
Compatibility
This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. It is recommended to use get_attribute instead.
See also the description of the methods set_optimizer_attribute
and set_optimizer_attributes
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> get_optimizer_attribute(model, MOI.Silent())
false
#
JuMP.has_duals
— Method
has_duals(model::GenericModel; result::Int = 1)
Returns true
if the solver has a dual solution in result index `result` available to query; otherwise, it returns `false`.
See also the description dual
, shadow_price
and result_count
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x);
julia> @constraint(model, c, x <= 1)
c : x ≤ 1
julia> @objective(model, Max, 2 * x + 1);
julia> has_duals(model)
false
julia> optimize!(model)
julia> has_duals(model)
true
#
JuMP.has_lower_bound
— Method
has_lower_bound(v::GenericVariableRef)
Returns the value true
if v
has a lower bound. If the value is true
, the lower bound can be requested using the method lower_bound
.
See also the description LowerBoundRef
, lower_bound
, set_lower_bound
and delete_lower_bound
.
Example
julia> model = Model();
julia> @variable(model, x >= 1.0);
julia> has_lower_bound(x)
true
#
JuMP.has_start_value
— Method
has_start_value(variable::AbstractVariableRef)
Returns the value true
if an initial value is set for the variable; otherwise, it returns `false'.
See also the description of the methods start_value
and set_start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 1.5);
julia> @variable(model, y);
julia> has_start_value(x)
true
julia> has_start_value(y)
false
julia> start_value(x)
1.5
julia> set_start_value(y, 2.0)
julia> has_start_value(y)
true
julia> start_value(y)
2.0
#
JuMP.has_upper_bound
— Method
has_upper_bound(v::GenericVariableRef)
Returns the value true
if v
has an upper bound. If the value is true
, the upper bound can be requested using the method upper_bound
.
See also the description UpperBoundRef
, upper_bound
, set_upper_bound
and delete_upper_bound
.
Example
julia> model = Model();
julia> @variable(model, x <= 1.0);
julia> has_upper_bound(x)
true
#
JuMP.has_values
— Method
has_values(model::GenericModel; result::Int = 1)
Returns true
if the solver has a primal solution in result index `result` available to query; otherwise, it returns `false`.
See also the description of the methods value
and result_count
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x);
julia> @constraint(model, c, x <= 1)
c : x ≤ 1
julia> @objective(model, Max, 2 * x + 1);
julia> has_values(model)
false
julia> optimize!(model)
julia> has_values(model)
true
#
JuMP.in_set_string
— Function
in_set_string(mode::MIME, set)
Returns the string String
representing membership of the set using the output mode `mode'.
Extensions
JuMP extensions can extend this method to new set
types to improve the readability of the output data.
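A hedged sketch (the MySet type is illustrative, not part of JuMP or MOI) of how an extension might customize the printed membership string:

struct MySet <: MOI.AbstractScalarSet end

JuMP.in_set_string(::MIME"text/plain", ::MySet) = "∈ MySet"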
Example
julia> in_set_string(MIME("text/plain"), MOI.Interval(1.0, 2.0))
"∈ [1, 2]"
#
JuMP.in_set_string
— Method
in_set_string(mode::MIME, constraint::AbstractConstraint)
Returns the string String
representing membership of the constraint set using the output mode `mode'.
#
JuMP.index
— Method
index(cr::ConstraintRef)::MOI.ConstraintIndex
Returns the index of the constraint corresponding to cr
in the MOI backend.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, x >= 0);
julia> index(c)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}(1)
#
JuMP.index
— Method
index(v::GenericVariableRef)::MOI.VariableIndex
Returns the index of the variable corresponding to v
in the MOI backend.
Example
julia> model = Model();
julia> @variable(model, x);
julia> index(x)
MOI.VariableIndex(1)
#
JuMP.is_binary
— Method
is_binary(v::GenericVariableRef)
Returns the value true
if the variable v
is constrained to be binary.
See also the description BinaryRef
, set_binary
and unset_binary
.
Example
julia> model = Model();
julia> @variable(model, x, Bin);
julia> is_binary(x)
true
#
JuMP.is_fixed
— Method
is_fixed(v::GenericVariableRef)
Returns the value true
if v' is a fixed variable. If the value is `true
, a fixed value can be requested using the method fix_value
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_fixed(x)
false
julia> fix(x, 1.0)
julia> is_fixed(x)
true
#
JuMP.is_integer
— Method
is_integer(v::GenericVariableRef)
Returns the value true
if the variable v
is constrained to be integer.
See also the description IntegerRef
, set_integer
and unset_integer
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_integer(x)
false
julia> set_integer(x)
julia> is_integer(x)
true
#
JuMP.is_parameter
— Method
is_parameter(x::GenericVariableRef)::Bool
Returns the value true
if x
is constrained to be a parameter.
See also the description ParameterRef
, set_parameter_value
and parameter_value
.
Example
julia> model = Model();
julia> @variable(model, p in Parameter(2))
p
julia> is_parameter(p)
true
julia> @variable(model, x)
x
julia> is_parameter(x)
false
#
JuMP.is_solved_and_feasible
— Method
is_solved_and_feasible(
model::GenericModel;
allow_local::Bool = true,
allow_almost::Bool = false,
dual::Bool = false,
result::Int = 1,
)
Returns the value true
if the model has a feasible primal solution associated with the result index `result`, and termination_status
has a value OPTIMAL
(the solver has found a global optimum) or LOCALLY_SOLVED
(the solver found a local optimum, which can also be global, but the solver failed to prove this).
When allow_local = false
, this function returns the value true' only if `termination_status
has a value OPTIMAL
.
When allow_almost = true
, the method termination_status
can also return the value ALMOST_OPTIMAL
or ALMOST_LOCALLY_SOLVED
(with allow_local'), and the methods `primal_status
and dual_status
can also return a value NEARLY_FEASIBLE_POINT
.
If the argument dual
is set to true, the function additionally checks that an optimal dual solution is available.
If this function returns `false`, use the methods termination_status
, result_count
, `primal_status`, and `dual_status` to understand what solutions are available (if any).
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> is_solved_and_feasible(model)
false
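A hedged usage sketch (assuming HiGHS is installed and the model solves to optimality; not part of the original docstring):

import HiGHS
model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, x)
optimize!(model)
is_solved_and_feasible(model)                        # OPTIMAL or LOCALLY_SOLVED
is_solved_and_feasible(model; allow_local = false)   # only OPTIMAL
is_solved_and_feasible(model; dual = true)           # also require an optimal dual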
#
JuMP.is_valid
— Method
is_valid(model::GenericModel, con_ref::ConstraintRef{<:AbstractModel})
Returns the value true
if con_ref' refers to a valid constraint in `model
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2 * x <= 1);
julia> is_valid(model, c)
true
julia> model_2 = Model();
julia> is_valid(model_2, c)
false
#
JuMP.is_valid
— Method
is_valid(model::GenericModel, variable_ref::GenericVariableRef)
Returns the value true
if variable
refers to a valid variable in `model'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_valid(model, x)
true
julia> model_2 = Model();
julia> is_valid(model_2, x)
false
#
JuMP.isequal_canonical
— Function
isequal_canonical(
x::T,
y::T
) where {T<:AbstractJuMPScalar,AbstractArray{<:AbstractJuMPScalar}}
Returns the value true
if x
is equal to y
after dropping zero terms and ignoring the order of terms.
This method is mainly useful for testing, because fallbacks such as x == y
do not account for mathematically equivalent expressions such as x[1] + 0 x[2] + 1 == x[1] + 1
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> a = x[1] + 1.0
x[1] + 1
julia> b = x[1] + x[2] + 1.0
x[1] + x[2] + 1
julia> add_to_expression!(b, -1.0, x[2])
x[1] + 0 x[2] + 1
julia> a == b
false
julia> isequal_canonical(a, b)
true
#
JuMP.jump_function
— Function
jump_function(model::AbstractModel, x::MOI.AbstractFunction)
Given a MathOptInterface object `x`, returns the JuMP equivalent.
See also the function description moi_function
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> f = 2.0 * index(x) + 1.0
1.0 + 2.0 MOI.VariableIndex(1)
julia> jump_function(model, f)
2 x + 1
#
JuMP.jump_function
— Method
jump_function(constraint::AbstractConstraint)
Returns the function of the constraint, in its function-in-set form, as an AbstractJuMPScalar or a Vector{AbstractJuMPScalar}
.
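An illustrative example (not from the original docstring):

model = Model()
@variable(model, x)
c = @constraint(model, 2x <= 1)
con = constraint_object(c)
jump_function(con)   # returns the affine expression 2 x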
#
JuMP.jump_function_type
— Function
jump_function_type(model::AbstractModel, ::Type{T}) where {T}
Given a MathOptInterface type T
, returns the equivalent JuMP type.
See also the function description moi_function_type
.
Example
julia> model = Model();
julia> jump_function_type(model, MOI.ScalarAffineFunction{Float64})
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
#
JuMP.latex_formulation
— Method
latex_formulation(model::AbstractModel)
Wraps the model model
in a type so that it is rendered as text/latex
in a notebook such as IJulia, or in Documenter.
To render the model, end the cell with latex_formulation(model)
, or call display(latex_formulation(model))
to render the model from inside a function.
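An illustrative usage sketch (not from the original docstring):

model = Model()
@variable(model, x >= 0)
@objective(model, Min, x)
latex_formulation(model)            # renders as text/latex in IJulia or Documenter
display(latex_formulation(model))   # render from inside a function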
#
JuMP.linear_terms
— Method
linear_terms(aff::GenericAffExpr{C,V})
Provides an iterator over tuples of coefficients and variables (a_i::C, x_i::V)
in the linear part of an affine expression.
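An illustrative example (not from the original docstring) of iterating over the linear terms:

model = Model()
@variable(model, x[1:2])
aff = 2.0 * x[1] + 3.0 * x[2] + 1.0
for (coef, var) in linear_terms(aff)
    println(coef, " * ", var)
end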
#
JuMP.linear_terms
— Method
linear_terms(quad::GenericQuadExpr{C,V})
Provides an iterator over tuples of (coefficient::C, variable::V)
in the linear part of a quadratic expression.
#
JuMP.list_of_constraint_types
— Method
list_of_constraint_types(model::GenericModel)::Vector{Tuple{Type,Type}}
Returns a list of tuples of the form (F, S)
, where F
is the type of the JuMP function, and S
is the type of the set MOI such that all_constraints(model, F, S)
returns a nonempty list.
Example
julia> model = Model();
julia> @variable(model, x >= 0, Bin);
julia> @constraint(model, 2x <= 1);
julia> list_of_constraint_types(model)
3-element Vector{Tuple{Type, Type}}:
(AffExpr, MathOptInterface.LessThan{Float64})
(VariableRef, MathOptInterface.GreaterThan{Float64})
(VariableRef, MathOptInterface.ZeroOne)
Performance Notes
Iterating over the list of function and set types is a type-unstable operation. Consider using a function barrier. For more information, see the Performance tips for extensions section of the documentation.
#
JuMP.lower_bound
— Method
lower_bound(v::GenericVariableRef)
Returns the lower bound of the variable. If it does not exist, it returns an error.
See also the description LowerBoundRef
, has_lower_bound
, set_lower_bound
and delete_lower_bound
.
Example
julia> model = Model();
julia> @variable(model, x >= 1.0);
julia> lower_bound(x)
1.0
#
JuMP.lp_matrix_data
— Method
lp_matrix_data(model::GenericModel{T})
Given a JuMP model of a linear program, returns an `LPMatrixData{T}` struct that stores the data of an equivalent linear program of the form

    min  c' x + c_offset
    s.t. b_lower <= A x <= b_upper
         x_lower <= x <= x_upper

where the elements of x may be continuous, integer, or binary variables.
Fields
The struct returned by `lp_matrix_data` has the following fields:
- A::SparseArrays.SparseMatrixCSC{T,Int}: the constraint matrix in sparse matrix form.
- b_lower::Vector{T}: the dense vector of row lower bounds. If a bound is missing, the value typemin(T) is used.
- b_upper::Vector{T}: the dense vector of row upper bounds. If a bound is missing, the value typemax(T) is used.
- x_lower::Vector{T}: the dense vector of variable lower bounds. If a bound is missing, the value typemin(T) is used.
- x_upper::Vector{T}: the dense vector of variable upper bounds. If a bound is missing, the value typemax(T) is used.
- c::Vector{T}: the dense vector of linear objective coefficients.
- c_offset::T: the constant term of the objective function.
- sense::MOI.OptimizationSense: the objective sense of the model.
- integers::Vector{Int}: a sorted list of column indices that are integer variables.
- binaries::Vector{Int}: a sorted list of column indices that are binary variables.
- variables::Vector{GenericVariableRef{T}}: a vector of GenericVariableRef, corresponding to the order of the columns in the matrix form.
- affine_constraints::Vector{ConstraintRef}: a vector of ConstraintRef, corresponding to the order of the rows in the matrix form.
Limitations
The models supported by `lp_matrix_data` are intentionally limited to linear programs.
Example
julia> model = Model();
julia> @variable(model, x[1:2] >= 0);
julia> @constraint(model, x[1] + 2 * x[2] <= 1);
julia> @objective(model, Max, x[2]);
julia> data = lp_matrix_data(model);
julia> data.A
1×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
1.0 2.0
julia> data.b_lower
1-element Vector{Float64}:
-Inf
julia> data.b_upper
1-element Vector{Float64}:
1.0
julia> data.x_lower
2-element Vector{Float64}:
0.0
0.0
julia> data.x_upper
2-element Vector{Float64}:
Inf
Inf
julia> data.c
2-element Vector{Float64}:
0.0
1.0
julia> data.c_offset
0.0
julia> data.sense
MAX_SENSE::OptimizationSense = 1
#
JuMP.lp_sensitivity_report
— Method
lp_sensitivity_report(model::GenericModel{T}; atol::T = Base.rtoldefault(T))::SensitivityReport{T} where {T}
Given a linear program `model` with a current optimal basis, returns a `SensitivityReport` object, which maps:
- each variable reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the objective coefficient of the corresponding variable can change so that the original basis remains optimal;
- each constraint reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the right-hand side of the corresponding constraint can change so that the basis remains optimal.
Both tuples are relative, not absolute. Thus, if the objective coefficient is 1.0 and the tuple is (-0.5, 0.5), the objective coefficient can range between 1.0 - 0.5 and 1.0 + 0.5.
`atol` is the primal/dual optimality tolerance, and it should match the tolerance of the solver used to compute the basis.
Note: interval constraints are not supported.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, -1 <= x <= 2)
x
julia> @objective(model, Min, x)
x
julia> optimize!(model)
julia> report = lp_sensitivity_report(model; atol = 1e-7);
julia> dx_lo, dx_hi = report[x]
(-1.0, Inf)
julia> println(
"The objective coefficient of `x` can decrease by $dx_lo or " *
"increase by $dx_hi."
)
The objective coefficient of `x` can decrease by -1.0 or increase by Inf.
julia> dRHS_lo, dRHS_hi = report[LowerBoundRef(x)]
(-Inf, 3.0)
julia> println(
"The lower bound of `x` can decrease by $dRHS_lo or increase " *
"by $dRHS_hi."
)
The lower bound of `x` can decrease by -Inf or increase by 3.0.
#
JuMP.map_coefficients
— Method
map_coefficients(f::Function, a::GenericAffExpr)
Applies the function `f` to the coefficients and the constant term of the `GenericAffExpr` `a` and returns a new expression.
See also the description of the method map_coefficients_inplace!
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1
julia> map_coefficients(c -> 2 * c, a)
2 x + 2
julia> a
x + 1
#
JuMP.map_coefficients
— Method
map_coefficients(f::Function, a::GenericQuadExpr)
Applies the function `f` to the coefficients and the constant term of the `GenericQuadExpr` `a` and returns a new expression.
See also the description of the method map_coefficients_inplace!
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> a = @expression(model, x^2 + x + 1)
x² + x + 1
julia> map_coefficients(c -> 2 * c, a)
2 x² + 2 x + 2
julia> a
x² + x + 1
#
JuMP.map_coefficients_inplace!
— Method
map_coefficients_inplace!(f::Function, a::GenericAffExpr)
Applies the function `f` to the coefficients and the constant term of the `GenericAffExpr` `a` and updates them in place.
See also the description of the method map_coefficients
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1
julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x + 2
julia> a
2 x + 2
#
JuMP.map_coefficients_inplace!
— Method
map_coefficients_inplace!(f::Function, a::GenericQuadExpr)
Applies the function `f` to the coefficients and the constant term of the `GenericQuadExpr` `a` and updates them in place.
See also the description of the method map_coefficients
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> a = @expression(model, x^2 + x + 1)
x² + x + 1
julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x² + 2 x + 2
julia> a
2 x² + 2 x + 2
#
JuMP.model_convert
— Method
model_convert(
model::AbstractModel,
rhs::Union{
AbstractConstraint,
Number,
AbstractJuMPScalar,
MOI.AbstractSet,
},
)
Converts the coefficients and constants of functions and sets in `rhs` to the coefficient type `value_type(typeof(model))`.
Purpose
Creating and adding a constraint is a two-step process. First `build_constraint` is called, and then the result is passed to `add_constraint`.
However, because `build_constraint` does not take `model` as an argument, the coefficients and constants of the function or set may differ from `value_type(typeof(model))`.
Therefore, the result of `build_constraint` is converted in a call to `model_convert` before the result is passed to `add_constraint`.
#
JuMP.model_string
— Method
model_string(mode::MIME, model::AbstractModel)
Returns a `String` representation of `model`, given the `mode`.
Example
julia> model = Model();
julia> @variable(model, x >= 0);
julia> print(model_string(MIME("text/plain"), model))
Feasibility
Subject to
x ≥ 0
#
JuMP.moi_function
— Function
moi_function(x::AbstractJuMPScalar)
moi_function(x::AbstractArray{<:AbstractJuMPScalar})
Given a JuMP object `x`, returns the MathOptInterface equivalent.
See also the function description jump_function
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> f = 2.0 * x + 1.0
2 x + 1
julia> moi_function(f)
1.0 + 2.0 MOI.VariableIndex(1)
#
JuMP.moi_function
— Method
moi_function(constraint::AbstractConstraint)
Returns the function of the constraint `constraint` in its function-in-set form, as a `MathOptInterface.AbstractFunction`.
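Example
A minimal sketch combining this method with `constraint_object`, which returns the `AbstractConstraint` behind a `ConstraintRef`:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1);
julia> f = moi_function(constraint_object(c));
julia> f isa MOI.ScalarAffineFunction{Float64}
true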
#
JuMP.moi_function_type
— Function
moi_function_type(::Type{T}) where {T}
Given a JuMP type `T`, returns the MathOptInterface equivalent.
See also the function description jump_function_type
.
Example
julia> moi_function_type(AffExpr)
MathOptInterface.ScalarAffineFunction{Float64}
#
JuMP.moi_set
— Method
moi_set(constraint::AbstractConstraint)
Returns the set of the constraint `constraint` in its function-in-set form, as a `MathOptInterface.AbstractSet`.
moi_set(s::AbstractVectorSet, dim::Int)
Returns the MOI set of dimension `dim` corresponding to the JuMP set `s`.
moi_set(s::AbstractScalarSet)
Returns the MOI set corresponding to the JuMP set `s`.
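Example
A minimal sketch of the first method, again via `constraint_object`; the printed form is indicative:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1);
julia> moi_set(constraint_object(c))
MathOptInterface.LessThan{Float64}(1.0)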
#
JuMP.name
— Method
name(model::AbstractModel)
Returns the `MOI.Name` attribute of the `model`'s backend, or a default name if it is empty.
Example
julia> model = Model();
julia> name(model)
"A JuMP Model"
#
JuMP.name
— Method
name(v::GenericVariableRef)::String
Returns an attribute of the variable name.
Example
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> name(x[1])
"x[1]"
#
JuMP.name
— Method
name(con_ref::ConstraintRef)
Returns an attribute of the constraint name.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> name(c)
"c"
#
JuMP.node_count
— Method
node_count(model::GenericModel)
Returns the total number of branch-and-bound nodes explored during the most recent optimization of a mixed-integer program (the `MOI.NodeCount` attribute), if it is available.
Throws a `MOI.GetAttributeNotAllowed` error if the attribute is not implemented by the solver.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> optimize!(model)
julia> node_count(model)
0
#
JuMP.nonlinear_constraint_string
— Method
nonlinear_constraint_string(
model::GenericModel,
mode::MIME,
c::_NonlinearConstraint,
)
Returns a string representation of the nonlinear constraint `c` belonging to `model`, given the `mode`.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
#
JuMP.nonlinear_dual_start_value
— Method
nonlinear_dual_start_value(model::Model)
Returns the current value of the MOI attribute `MOI.NLPBlockDualStart'.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
#
JuMP.nonlinear_expr_string
— Method
nonlinear_expr_string(
model::GenericModel,
mode::MIME,
c::MOI.Nonlinear.Expression,
)
Returns the string representation of the nonlinear expression c
belonging to `model' for the `mode' mode.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
#
JuMP.nonlinear_model
— Method
nonlinear_model(
model::GenericModel;
force::Bool = false,
)::Union{MOI.Nonlinear.Model,Nothing}
If `model` has nonlinear components, returns a `MOI.Nonlinear.Model`; otherwise returns `nothing`.
If `force` is `true`, a `MOI.Nonlinear.Model` is always returned, and if one does not exist for the model, an empty one is created.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
#
JuMP.normalized_coefficient
— Method
normalized_coefficient(
constraint::ConstraintRef,
variable_1::GenericVariableRef,
variable_2::GenericVariableRef,
)
Returns the quadratic coefficient associated with variable_1
and variable_2
in the constraint, which was normalized by JuMP to the standard form.
See also the description of the method set_normalized_coefficient
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2
julia> normalized_coefficient(con, x[1], x[1])
2.0
julia> normalized_coefficient(con, x[1], x[2])
3.0
julia> @constraint(model, con_vec, x.^2 <= [1, 2])
con_vec : [x[1]² - 1, x[2]² - 2] ∈ MathOptInterface.Nonpositives(2)
julia> normalized_coefficient(con_vec, x[1], x[1])
1-element Vector{Tuple{Int64, Float64}}:
(1, 1.0)
julia> normalized_coefficient(con_vec, x[1], x[2])
Tuple{Int64, Float64}[]
#
JuMP.normalized_coefficient
— Method
normalized_coefficient(
constraint::ConstraintRef,
variable::GenericVariableRef,
)
Returns the coefficient associated with the variable
in the `constraint', which has been normalized by JuMP to the standard form.
See also the description of the method set_normalized_coefficient
.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2
julia> normalized_coefficient(con, x)
5.0
julia> @constraint(model, con_vec, [x, 2x + 1, 3] >= 0)
con_vec : [x, 2 x + 1, 3] ∈ MathOptInterface.Nonnegatives(3)
julia> normalized_coefficient(con_vec, x)
2-element Vector{Tuple{Int64, Float64}}:
(1, 1.0)
(2, 2.0)
#
JuMP.normalized_rhs
— Method
normalized_rhs(constraint::ConstraintRef)
Returns the right-hand side term of `constraint` after JuMP has normalized the constraint into its standard form.
See also the description of the method set_normalized_rhs
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1
julia> normalized_rhs(con)
1.0
#
JuMP.num_constraints
— Method
num_constraints(model::GenericModel, function_type, set_type)::Int64
Returns the number of constraints that currently exist in the model, where the function is of type function_type
and the set is of type `set_type'.
See also the description of the methods list_of_constraint_types
and all_constraints
.
Example
julia> model = Model();
julia> @variable(model, x >= 0, Bin);
julia> @variable(model, y);
julia> @constraint(model, y in MOI.GreaterThan(1.0));
julia> @constraint(model, y <= 1.0);
julia> @constraint(model, 2x <= 1);
julia> num_constraints(model, VariableRef, MOI.GreaterThan{Float64})
2
julia> num_constraints(model, VariableRef, MOI.ZeroOne)
1
julia> num_constraints(model, AffExpr, MOI.LessThan{Float64})
2
#
JuMP.num_constraints
— Method
num_constraints(model::GenericModel; count_variable_in_set_constraints::Bool)
Returns the number of constraints in `model`.
If `count_variable_in_set_constraints == true`, variable constraints such as `VariableRef`-in-`Integer` are included. To count only structural constraints (for example, the rows in the constraint matrix of a linear program), pass `count_variable_in_set_constraints = false`.
Example
julia> model = Model();
julia> @variable(model, x >= 0, Int);
julia> @constraint(model, 2x <= 1);
julia> num_constraints(model; count_variable_in_set_constraints = true)
3
julia> num_constraints(model; count_variable_in_set_constraints = false)
1
#
JuMP.num_nonlinear_constraints
— Method
num_nonlinear_constraints(model::GenericModel)
Returns the number of nonlinear constraints associated with `model`.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
This function counts only the constraints added with `@NLconstraint` and `add_nonlinear_constraint`. It does not count `GenericNonlinearExpr` constraints.
#
JuMP.num_variables
— Method
num_variables(model::GenericModel)::Int64
Returns the number of variables in the `model'.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> num_variables(model)
2
#
JuMP.object_dictionary
— Method
object_dictionary(model::GenericModel)
Returns a dictionary in which the symbolic name of a variable, constraint, or expression is mapped to the corresponding object.
Objects are associated with certain symbols in macros. For example, @variable(model, x[1:2, 1:2])
binds an array of variables x
to the symbol :x
.
This method must be defined for any subtype of `AbstractModel'.
See also the description of the method unregister
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> object_dictionary(model)
Dict{Symbol, Any} with 1 entry:
:x => VariableRef[x[1], x[2]]
#
JuMP.objective_bound
— Method
objective_bound(model::GenericModel)
Returns the best known bound on the optimal objective value after a call to `optimize!(model)`.
For scalar-valued objectives, this function returns a `Float64`. For vector-valued objectives, it returns a `Vector{Float64}`.
In the case of a vector-valued objective, the ideal point is returned, that is, the point obtained by optimizing each objective independently.
This function is equivalent to querying the `MOI.ObjectiveBound` attribute.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1, Int);
julia> @objective(model, Min, 2 * x + 1);
julia> optimize!(model)
julia> objective_bound(model)
3.0
#
JuMP.objective_function
— Method
objective_function(
model::GenericModel,
::Type{F} = objective_function_type(model),
) where {F}
Returns an object of type F
representing the objective function.
Errors if the objective function cannot be converted to type `F`.
This function is equivalent to requesting the attribute MOI.ObjectiveFunction{F}
.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @objective(model, Min, 2x + 1)
2 x + 1
julia> objective_function(model, AffExpr)
2 x + 1
julia> objective_function(model, QuadExpr)
2 x + 1
julia> typeof(objective_function(model, QuadExpr))
QuadExpr (alias for GenericQuadExpr{Float64, GenericVariableRef{Float64}})
The last two commands show that even though the objective function is affine, it can be queried as a quadratic function, because it can be converted into one. However, it cannot be queried as a variable:
julia> objective_function(model, VariableRef)
ERROR: InexactError: convert(MathOptInterface.VariableIndex, 1.0 + 2.0 MOI.VariableIndex(1))
[...]
#
JuMP.objective_function_string
— Method
objective_function_string(mode, model::AbstractModel)::String
Returns a string describing the objective function of the model.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @objective(model, Min, 2 * x);
julia> objective_function_string(MIME("text/plain"), model)
"2 x"
#
JuMP.objective_function_type
— Method
objective_function_type(model::GenericModel)::AbstractJuMPScalar
Returns the type of the objective function.
This function is equivalent to querying the `MOI.ObjectiveFunctionType` attribute.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @objective(model, Min, 2 * x + 1);
julia> objective_function_type(model)
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
#
JuMP.objective_sense
— Method
objective_sense(model::GenericModel)::MOI.OptimizationSense
Returns the objective sense.
This function is equivalent to querying the `MOI.ObjectiveSense` attribute.
Example
julia> model = Model();
julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2
julia> @variable(model, x);
julia> @objective(model, Max, x)
x
julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
#
JuMP.objective_value
— Method
objective_value(model::GenericModel; result::Int = 1)
Returns the objective value associated with the result index `result` of the most recent solution returned by the solver.
For scalar-valued objectives, this function returns a `Float64`. For vector-valued objectives, it returns a `Vector{Float64}`.
This function is equivalent to querying the `MOI.ObjectiveValue` attribute.
See also the description of the method result_count
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1);
julia> @objective(model, Min, 2 * x + 1);
julia> optimize!(model)
julia> objective_value(model)
3.0
julia> objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.ObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
#
JuMP.op_ifelse
— Method
op_ifelse(a, x, y)
A function that falls back to `ifelse(a, x, y)`, but which returns a `GenericNonlinearExpr` when called with a JuMP variable or expression in the first argument.
Example
julia> model = Model();
julia> @variable(model, x);
julia> op_ifelse(true, 1.0, 2.0)
1.0
julia> op_ifelse(x, 1.0, 2.0)
ifelse(x, 1.0, 2.0)
julia> op_ifelse(true, x, 2.0)
x
#
JuMP.op_string
— Method
op_string(mime::MIME, x::GenericNonlinearExpr, ::Val{op}) where {op}
Returns the string that should be printed for the operator `op` when `function_string` is called with `mime` and `x`.
Example
julia> model = Model();
julia> @variable(model, x[1:2], Bin);
julia> f = @expression(model, x[1] || x[2]);
julia> op_string(MIME("text/plain"), f, Val(:||))
"||"
#
JuMP.operator_to_set
— Method
operator_to_set(error_fn::Function, ::Val{sense_symbol})
Converts a sense symbol into a set, so that `@constraint(model, func sense_symbol 0)` is equivalent to `@constraint(model, func in set)` for any `func::AbstractJuMPScalar`.
Example
After defining a custom set, you can create a JuMP constraint with it directly:
julia> struct CustomSet{T} <: MOI.AbstractScalarSet
value::T
end
julia> Base.copy(x::CustomSet) = CustomSet(x.value)
julia> model = Model();
julia> @variable(model, x)
x
julia> cref = @constraint(model, x in CustomSet(1.0))
x ∈ CustomSet{Float64}(1.0)
However, there may be a sense symbol that provides a more convenient syntax:
julia> JuMP.operator_to_set(::Function, ::Val{:⊰}) = CustomSet(0.0)
julia> MOIU.supports_shift_constant(::Type{<:CustomSet}) = true
julia> MOIU.shift_constant(set::CustomSet, value) = CustomSet(set.value + value)
julia> cref = @constraint(model, x ⊰ 1)
x ∈ CustomSet{Float64}(1.0)
Note that the whole function is first moved to the right-hand side, then the sense symbol is converted to a set with a zero constant, and finally the constant is moved into the set using `MOIU.shift_constant`.
#
JuMP.operator_warn
— Method
operator_warn(model::AbstractModel)
operator_warn(model::GenericModel)
This function is called on the model whenever two affine expressions are added together without using `destructive_add!`, and at least one of the two expressions has more than 50 terms.
For `Model`, if this function is called more than 20,000 times, a warning is generated once.
This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.
#
JuMP.optimize!
— Method
optimize!(
model::GenericModel;
ignore_optimize_hook = (model.optimize_hook === nothing),
kwargs...,
)
Optimizes the model.
If the optimizer has not been set yet (see `set_optimizer`), a `NoOptimizer` error is thrown.
If `ignore_optimize_hook == true`, the optimize hook is ignored and the model is solved as if the hook had not been set. The keyword arguments `kwargs` are passed to the `optimize_hook`. An error is thrown if `optimize_hook` is `nothing` and keyword arguments are provided.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> function my_optimize_hook(model; foo)
println("Hook called with foo = ", foo)
return optimize!(model; ignore_optimize_hook = true)
end
my_optimize_hook (generic function with 1 method)
julia> set_optimize_hook(model, my_optimize_hook)
my_optimize_hook (generic function with 1 method)
julia> optimize!(model; foo = 2)
Hook called with foo = 2
#
JuMP.optimizer_index
— Method
optimizer_index(x::GenericVariableRef)::MOI.VariableIndex
optimizer_index(x::ConstraintRef{<:GenericModel})::MOI.ConstraintIndex
Returns the index of the variable or constraint corresponding to `x` in the optimizer model `unsafe_backend(owner_model(x))`.
This function should be used with `unsafe_backend`.
As a safer alternative, use `backend` and `index`. For more information, see the docstrings of `backend` and `unsafe_backend`.
Exceptions
- Throws `NoOptimizer` if no optimizer is set.
- Throws an `ErrorException` if the optimizer is set but not attached.
- Throws an `ErrorException` if the index is bridged.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 0)
x
julia> MOI.Utilities.attach_optimizer(model)
julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.
julia> optimizer_index(x)
MOI.VariableIndex(1)
#
JuMP.optimizer_with_attributes
— Method
optimizer_with_attributes(optimizer_constructor, attrs::Pair...)
Groups an optimizer constructor with a list of attributes `attrs`. Note that this is equivalent to `MOI.OptimizerWithAttributes`.
When provided to the `Model` constructor or to `set_optimizer`, it creates an optimizer by calling `optimizer_constructor()` and then sets the attributes using `set_attribute`.
See also the descriptions of the methods `set_attribute` and `get_attribute`.
Note
The string names of attributes are solver-specific. Consult the solver's documentation to find the attributes of interest.
Example
julia> import HiGHS
julia> optimizer = optimizer_with_attributes(
HiGHS.Optimizer, "presolve" => "off", MOI.Silent() => true,
);
julia> model = Model(optimizer);
equivalent to the following:
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_attribute(model, "presolve", "off")
julia> set_attribute(model, MOI.Silent(), true)
#
JuMP.owner_model
— Function
owner_model(s::AbstractJuMPScalar)
Returns the model that the scalar value s
belongs to.
Example
julia> model = Model();
julia> @variable(model, x);
julia> owner_model(x) === model
true
#
JuMP.owner_model
— Method
owner_model(v::AbstractVariableRef)
Returns the model that v
belongs to.
Example
julia> model = Model();
julia> x = @variable(model)
_[1]
julia> owner_model(x) === model
true
#
JuMP.owner_model
— Method
owner_model(con_ref::ConstraintRef)
Returns the model that con_ref
belongs to.
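Example
A minimal example, mirroring the other owner_model methods:
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, 2x <= 1);
julia> owner_model(c) === model
true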
#
JuMP.parameter_value
— Method
parameter_value(x::GenericVariableRef)
Returns the value of the x
parameter.
If x
is not a parameter, it returns an error.
See also the description ParameterRef
, is_parameter
and set_parameter_value
.
Example
julia> model = Model();
julia> @variable(model, p in Parameter(2))
p
julia> parameter_value(p)
2.0
julia> set_parameter_value(p, 2.5)
julia> parameter_value(p)
2.5
#
JuMP.parse_constraint
— Method
parse_constraint(error_fn::Function, expr::Expr)
The entry point for all actions related to constraint analysis.
Arguments
- The `error_fn` function is passed everywhere to provide better error messages.
- `expr` comes from the `@constraint` macro. There are two possible options:
  - @constraint(model, expr)
  - @constraint(model, name[args], expr)
  In both cases, `expr` is the main component of the constraint.
Supported syntax
JuMP currently supports the following `expr` objects:
- lhs <= rhs
- lhs == rhs
- lhs >= rhs
- l <= body <= u
- u >= body >= l
- lhs ⟂ rhs
- lhs in rhs
- lhs ∈ rhs
- z --> {constraint}
- !z --> {constraint}
- z <--> {constraint}
- !z <--> {constraint}
- z => {constraint}
- !z => {constraint}
as well as all broadcasted variants.
Extensions
The infrastructure behind `parse_constraint` is extendable. For more information, see the descriptions of `parse_constraint_head` and `parse_constraint_call`.
#
JuMP.parse_constraint_call
— Method
parse_constraint_call(
error_fn::Function,
vectorized::Bool,
::Val{op},
lhs,
rhs,
) where {op}
Fallback handler for binary operators. These might be infix operators such as `@constraint(model, lhs op rhs)`, or normal operators such as `@constraint(model, op(lhs, rhs))`.
In both cases, the expression is rewritten as `lhs - rhs in operator_to_set(error_fn, op)`.
For more information, see the description of `operator_to_set`.
#
JuMP.parse_constraint_call
— Method
parse_constraint_call(
error_fn::Function,
is_vectorized::Bool,
::Val{op},
args...,
)
Implement this method to intercept the parsing of a `:call` expression with the operator `op`.
Extending the constraint macro at parse time is an advanced operation with the potential to interfere with existing JuMP syntax. Please discuss the possible consequences in the developer chat (https://gitter.im/JuliaOpt/jump-dev) before publishing any code that implements these methods.
Arguments
- `error_fn`: a function that takes a `String` and throws an error with descriptive information about the macro from which it was thrown.
- `is_vectorized`: a Boolean indicating whether `op` should be broadcast.
- `op`: the first element of the `.args` field of the intercepted `Expr`.
- `args...`: the `.args` field of the `Expr`.
Return values
This function should return:
- `parse_code::Expr`: an expression containing any setup or rewriting code that must be executed before calling `build_constraint`;
- `build_code::Expr`: an expression that calls `build_constraint(` or `build_constraint.(` depending on `is_vectorized`.
See also the description of the methods parse_constraint_head
and build_constraint
#
JuMP.parse_constraint_head
— Method
parse_constraint_head(error_fn::Function, ::Val{head}, args...)
Implement this method to intercept the parsing of an expression whose head is `head`.
Extending the constraint macro at parse time is an advanced operation with the potential to interfere with existing JuMP syntax. Please discuss the possible consequences in the developer chat (https://gitter.im/JuliaOpt/jump-dev) before publishing any code that implements these methods.
Arguments
-
`error_fn': A function that accepts a `String' and outputs a string error with descriptive information about the macro from which it was issued.
-
head
: the.head
field of the interceptedExpr
expression. -
args...
: the '.args` field of theExpr
expression.
Return values
This function should return the following.
-
is_vectorized::Bool
: Does the expression represent a translatable expression, such asx .<= 1
; -
parse_code::Expr
: an expression containing the configuration or rewrite code to be called beforebuild_constraint
; -
build_code::Expr
: an expression that callsbuild_constraint(
orbuild_constraint.(
depending on `is_vectorized'.
Existing implementations
JuMP currently implements the following:
-
::Val{:call}
— redirects calls toparse_constraint_call
; -
::Val{:comparison}
— handles the special case ofl <= body <= u
.
See also the description of the methods parse_constraint_call
and build_constraint
#
JuMP.parse_one_operator_variable
— Method
parse_one_operator_variable(
error_fn::Function,
info_expr::_VariableInfoExpr,
sense::Val{S},
value,
) where {S}
Updates `info_expr` for a variable expression in the `@variable` macro of the form `variable_name S value`.
#
JuMP.parse_ternary_variable
— Method
parse_ternary_variable(error_fn, info_expr, lhs_sense, lhs, rhs_sense, rhs)
A hook for JuMP extensions to intercept the parsing of a `:comparison` expression, which has the form `lhs lhs_sense variable rhs_sense rhs`.
#
JuMP.primal_feasibility_report
— Method
primal_feasibility_report(
model::GenericModel{T},
point::AbstractDict{GenericVariableRef{T},T} = _last_primal_solution(model),
atol::T = zero(T),
skip_missing::Bool = false,
)::Dict{Any,T}
Given a dictionary `point` that maps variables to primal values, returns a dictionary whose keys are the constraints with an infeasibility greater than the tolerance `atol`. The value corresponding to each key is the infeasibility, defined as the Euclidean distance between the primal value of the constraint (see `MOI.ConstraintPrimal`) and the nearest point in the corresponding set.
Notes
- If `skip_missing = true`, constraints containing variables that are missing from `point` are ignored.
- If `skip_missing = false` and a partial primal solution is provided, an error is thrown.
- If no point is provided, the primal solution from the most recent solve of the model is used.
Example
julia> model = Model();
julia> @variable(model, 0.5 <= x <= 1);
julia> primal_feasibility_report(model, Dict(x => 0.2))
Dict{Any, Float64} with 1 entry:
x ≥ 0.5 => 0.3
#
JuMP.primal_feasibility_report
— Method
primal_feasibility_report(
point::Function,
model::GenericModel{T};
atol::T = zero(T),
skip_missing::Bool = false,
) where {T}
A variant of `primal_feasibility_report` in which a function is passed as the first argument instead of a dictionary as the second argument.
Example
julia> model = Model();
julia> @variable(model, 0.5 <= x <= 1, start = 1.3);
julia> primal_feasibility_report(model) do v
return start_value(v)
end
Dict{Any, Float64} with 1 entry:
x ≤ 1 => 0.3
#
JuMP.primal_status
— Method
primal_status(model::GenericModel; result::Int = 1)
Returns a `MOI.ResultStatusCode` describing the status of the most recent primal solution of the solver (that is, the `MOI.PrimalStatus` attribute) associated with the result index `result`.
See also the description of the method result_count
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> primal_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
#
JuMP.print_active_bridges
— Method
print_active_bridges(
[io::IO = stdout,]
model::GenericModel,
F::Type,
S::Type{<:MOI.AbstractSet},
)
Prints a list of the bridges required for a constraint of type `F`-in-`S`.
#
JuMP.print_active_bridges
— Method
print_active_bridges(
[io::IO = stdout,]
model::GenericModel,
S::Type{<:MOI.AbstractSet},
)
Prints a list of the bridges required for a variable constrained to the set `S`.
#
JuMP.print_active_bridges
— Method
print_active_bridges([io::IO = stdout,] model::GenericModel)
Prints a list of the variable, constraint, and objective bridges that are currently used in the model.
#
JuMP.print_active_bridges
— Method
print_active_bridges([io::IO = stdout,] model::GenericModel, ::Type{F}) where {F}
Prints a list of the bridges required for an objective function of type `F`.
#
JuMP.print_bridge_graph
— Method
print_bridge_graph([io::IO,] model::GenericModel)
Prints the hypergraph containing all the variable, constraint, and objective types that can be obtained by bridging the variables, constraints, and objectives that are present in the model.
This function is intended for advanced users. To see only the bridges that are currently used, use print_active_bridges instead.
Explanation of the output data
Each node of the hypergraph corresponds to a variable, constraint, or objective type.
- Variable nodes are indicated by [ ].
- Constraint nodes are indicated by ( ).
- Objective nodes are indicated by | |.
The number inside each pair of brackets is the index of the node in the hypergraph.
Note that this hypergraph is the full list of possible transformations. When a bridged model is created, the shortest hyperpath(s) through this graph are used, so many nodes may be unused.
For more information, see the work of the authors Legat, B., Dowson, O., Garcia, J., and Lubin, M. (2020). MathOptInterface: a data structure for mathematical optimization problems. URL address: https://arxiv.org/abs/2002.03447
#
JuMP.quad_terms
— Method
quad_terms(quad::GenericQuadExpr{C,V})
Provides an iterator over tuples (coefficient::C, var_1::V, var_2::V)
in the quadratic part of the quadratic expression.
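Example
A minimal sketch of iterating over the quadratic terms; iteration follows the order in which the terms were added to the expression:
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> q = 2.0 * x[1]^2 + 3.0 * x[1] * x[2] + x[2];
julia> for (coef, v1, v2) in quad_terms(q)
           println(coef, " => ", v1, ", ", v2)
       end
2.0 => x[1], x[1]
3.0 => x[1], x[2]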
#
JuMP.raw_status
— Method
raw_status(model::GenericModel)
Returns the reason why the solver stopped, in its own words (that is, the MathOptInterface model attribute `MOI.RawStatusString`).
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> raw_status(model)
"optimize not called"
#
JuMP.read_from_file
— Method
read_from_file(
filename::String;
format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
kwargs...,
)
Returns a JuMP model read from `filename` in the format `format`.
If the filename ends in `.gz`, it will be decompressed using GZip. If the filename ends in `.bz2`, it will be decompressed using BZip2.
Other `kwargs` are passed to the `Model` constructor of the chosen format.
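Example
A minimal round-trip sketch using write_to_file and a temporary directory; the model is arbitrary:
julia> model = Model();
julia> @variable(model, x >= 0);
julia> filename = joinpath(mktempdir(), "model.mps");
julia> write_to_file(model, filename)
julia> model_2 = read_from_file(filename);
julia> num_variables(model_2)
1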
#
JuMP.reduced_cost
— Method
reduced_cost(x::GenericVariableRef{T})::T where {T}
Returns the reduced cost associated with the variable `x`.
One interpretation of the reduced cost is the change in the objective from an infinitesimal relaxation of the variable bounds.
This method is equivalent to querying the shadow price of the active variable bound (if one exists and is active).
See also the description of the method shadow_price
.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x <= 1);
julia> @objective(model, Max, 2 * x + 1);
julia> optimize!(model)
julia> has_duals(model)
true
julia> reduced_cost(x)
2.0
#
JuMP.register
— Method
register(
model::Model,
s::Symbol,
dimension::Integer,
f::Function,
∇f::Function,
∇²f::Function,
)
Registers the user-defined function `f`, which takes `dimension` arguments, in the model `model` as the symbol `s`. In addition, provide a gradient function `∇f` and a Hessian function `∇²f`.
`∇f` and `∇²f` must return numbers corresponding to the first- and second-order derivatives of the function `f`, respectively.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
Notes
- Because automatic differentiation is not used, you can assume that all inputs are of type `Float64`.
- This method will throw an error if `dimension > 1`.
- The symbol `s` does not have to be the same as `f`, but doing so improves readability.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::Float64) = x^2
f (generic function with 1 method)
julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)
julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)
julia> register(model, :foo, 1, f, ∇f, ∇²f)
julia> @NLobjective(model, Min, foo(x))
#
JuMP.register
— Method
register(
model::Model,
s::Symbol,
dimension::Integer,
f::Function,
∇f::Function;
autodiff:Bool = false,
)
Registers the user-defined function `f`, which takes `dimension` arguments, in the model `model` as the symbol `s`. In addition, provide a gradient function `∇f`.
The functions `f` and `∇f` must support all subtypes of `Real` as arguments. Do not assume that the inputs are `Float64`.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
Notes
- If the function `f` is univariate (that is, `dimension == 1`), `∇f` must return a number representing the first-order derivative of the function `f`.
- If the function `f` is multivariate, `∇f` must have the signature `∇f(g::AbstractVector{T}, args::T...) where {T<:Real}`, where the first argument is a vector `g` that is modified in place with the gradient.
- If `autodiff = true` and `dimension == 1`, automatic differentiation is used to compute the second-order derivative; if `autodiff = false`, only first-order derivative information will be used.
- The symbol `s` does not have to be the same as `f`, but doing so improves readability.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)
julia> ∇f(x::T) where {T<:Real} = 2 * x
∇f (generic function with 1 method)
julia> register(model, :foo, 1, f, ∇f; autodiff = true)
julia> @NLobjective(model, Min, foo(x))
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)
julia> function ∇g(g::AbstractVector{T}, x::T, y::T) where {T<:Real}
g[1] = y
g[2] = x
return
end
∇g (generic function with 1 method)
julia> register(model, :g, 2, g, ∇g)
julia> @NLobjective(model, Min, g(x[1], x[2]))
#
JuMP.register
— Method
register(
model::Model,
op::Symbol,
dimension::Integer,
f::Function;
autodiff:Bool = false,
)
Registers the user-defined function `f`, which takes `dimension` arguments, in the model `model` as the symbol `op`.
The function `f` must support all subtypes of `Real` as arguments. Do not assume that the inputs are `Float64`.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
Notes
- This method requires you to explicitly set `autodiff = true`, because no user-defined gradient function `∇f` is given.
- Second-derivative information is computed only if `dimension == 1`.
- The symbol `op` does not have to be the same as `f`, but doing so improves readability.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)
julia> register(model, :foo, 1, f; autodiff = true)
julia> @NLobjective(model, Min, foo(x))
julia> model = Model();
julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
x[1]
x[2]
julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)
julia> register(model, :g, 2, g; autodiff = true)
julia> @NLobjective(model, Min, g(x[1], x[2]))
#
JuMP.relative_gap
— Method
relative_gap(model::GenericModel)
Returns the final relative optimality gap after calling optimize!(model)
.
The exact value depends on the implementation of MOI.RelativeGap
by the specific solver used for optimization.
This function is equivalent to requesting the MOI.RelativeGap
attribute.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1, Int);
julia> @objective(model, Min, 2 * x + 1);
julia> optimize!(model)
julia> relative_gap(model)
0.0
#
JuMP.relax_integrality
— Method
relax_integrality(model::GenericModel)
Modifies `model` to "relax" all binary and integrality constraints. Specifically:
- Binary constraints are deleted, and the variable bounds are tightened, if necessary, so that the variable is constrained to the interval [0, 1].
- Integrality constraints are deleted without modifying the variable bounds.
- An error is thrown if semi-continuous or semi-integer constraints are present (support for relaxing these may be added in the future).
- All other constraints are ignored (left in place). This includes discrete constraints such as SOS and indicator constraints.
Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.
Example
julia> model = Model();
julia> @variable(model, x, Bin);
julia> @variable(model, 1 <= y <= 10, Int);
julia> @objective(model, Min, x + y);
julia> undo_relax = relax_integrality(model);
julia> print(model)
Min x + y
Subject to
x ≥ 0
y ≥ 1
x ≤ 1
y ≤ 10
julia> undo_relax()
julia> print(model)
Min x + y
Subject to
y ≥ 1
y ≤ 10
y integer
x binary
#
JuMP.relax_with_penalty!
— Method
relax_with_penalty!(
model::GenericModel{T},
[penalties::Dict{ConstraintRef,T}];
[default::Union{Nothing,Real} = nothing,]
) where {T}
Destructively modifies `model` in place to relax its constraints with penalties.
This is a destructive routine that modifies the model in place. To avoid modifying the original model, make a copy of it (for example, with copy_model) before calling this function.
Reformulation
For details of the reformulation, see `MOI.Utilities.ScalarPenaltyRelaxation`.
For each constraint `ci`, the penalty passed to `MOI.Utilities.ScalarPenaltyRelaxation` is `get(penalties, ci, default)`. If the value is `nothing`, because `ci` does not exist in `penalties` and `default = nothing`, then the constraint is skipped.
Return value
This function returns a `Dict{ConstraintRef,AffExpr}` that maps each constraint index to the corresponding `y + z` as an `AffExpr`. In an optimal solution, query the value of these functions to compute the violation of each constraint.
Relaxing a subset of constraints
To relax only a subset of the constraints, pass a `penalties` dictionary and set `default = nothing`.
Example
julia> function new_model()
model = Model()
@variable(model, x)
@objective(model, Max, 2x + 1)
@constraint(model, c1, 2x - 1 <= -2)
@constraint(model, c2, 3x >= 0)
return model
end
new_model (generic function with 1 method)
julia> model_1 = new_model();
julia> penalty_map = relax_with_penalty!(model_1; default = 2.0);
julia> penalty_map[model_1[:c1]]
_[3]
julia> penalty_map[model_1[:c2]]
_[2]
julia> print(model_1)
Max 2 x - 2 _[2] - 2 _[3] + 1
Subject to
c2 : 3 x + _[2] ≥ 0
c1 : 2 x - _[3] ≤ -1
_[2] ≥ 0
_[3] ≥ 0
julia> model_2 = new_model();
julia> relax_with_penalty!(model_2, Dict(model_2[:c2] => 3.0))
Dict{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}, AffExpr} with 1 entry:
c2 : 3 x + _[2] ≥ 0 => _[2]
julia> print(model_2)
Max 2 x - 3 _[2] + 1
Subject to
c2 : 3 x + _[2] ≥ 0
c1 : 2 x ≤ -1
_[2] ≥ 0
#
JuMP.remove_bridge
— Method
remove_bridge(
model::GenericModel{S},
BT::Type{<:MOI.Bridges.AbstractBridge};
coefficient_type::Type{T} = S,
) where {S,T}
Deletes BT{T}
from the list of bridges that can be used to convert unsupported constraints into an equivalent form supported by the optimizer.
See also the description of the method add_bridge
.
Example
julia> model = Model();
julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)
julia> remove_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)
julia> add_bridge(
model,
MOI.Bridges.Constraint.NumberConversionBridge;
coefficient_type = Complex{Float64},
)
julia> remove_bridge(
model,
MOI.Bridges.Constraint.NumberConversionBridge;
coefficient_type = Complex{Float64},
)
#
JuMP.reshape_set
— Function
reshape_set(vectorized_set::MOI.AbstractSet, shape::AbstractShape)
Returns the set in its original shape `shape`, given the vectorized set `vectorized_set`.
Example
For a SymmetricMatrixShape with vectorized form [1, 2, 3] in MOI.PositiveSemidefiniteConeTriangle(2), the following code returns the set of the original constraint Symmetric([1 2; 2 3]) in PSDCone():
julia> reshape_set(MOI.PositiveSemidefiniteConeTriangle(2), SymmetricMatrixShape(2))
PSDCone()
#
JuMP.reshape_vector
— Function
reshape_vector(vectorized_form::Vector, shape::AbstractShape)
Returns an object in its original shape `shape`, given the vectorized form `vectorized_form`.
Example
For a SymmetricMatrixShape with vectorized form [1, 2, 3], the following code returns the matrix Symmetric([1 2; 2 3]):
julia> reshape_vector([1, 2, 3], SymmetricMatrixShape(2))
2×2 LinearAlgebra.Symmetric{Int64, Matrix{Int64}}:
1 2
2 3
#
JuMP.reverse_sense
— Function
reverse_sense(::Val{T}) where {T}
Given an (in)equality symbol `T`, returns a new `Val` object with the opposite (in)equality symbol.
This function is intended for use by JuMP extensions.
Example
julia> reverse_sense(Val(:>=))
Val{:<=}()
#
JuMP.set_attribute
— Method
set_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute, value)
set_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute, value)
set_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute, value)
Sets the value of the solver-related attribute attr
to value
.
It is equivalent to calling MOI.set
with the corresponding MOI model, and for variables and constraints, with the corresponding index MOI.VariableIndex
or `MOI.ConstraintIndex'.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1
julia> set_attribute(model, MOI.Name(), "model_new")
julia> set_attribute(x, MOI.VariableName(), "x_new")
julia> set_attribute(c, MOI.ConstraintName(), "c_new")
#
JuMP.set_attribute
— Method
set_attribute(
model::Union{GenericModel,MOI.OptimizerWithAttributes},
attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
value,
)
Sets the value of the solver-related attribute attr
to value
.
It is equivalent to calling MOI.set
with the corresponding MOI model.
If attr
is an AbstractString
string, it is converted to MOI.RawOptimizerAttribute
.
Example
julia> import HiGHS
julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => false);
julia> model = Model(opt);
julia> set_attribute(model, "output_flag", false)
julia> set_attribute(model, MOI.RawOptimizerAttribute("output_flag"), true)
julia> set_attribute(opt, "output_flag", true)
julia> set_attribute(opt, MOI.RawOptimizerAttribute("output_flag"), false)
#
JuMP.set_attributes
— Method
set_attributes(
destination::Union{
GenericModel,
MOI.OptimizerWithAttributes,
GenericVariableRef,
ConstraintRef,
},
pairs::Pair...,
)
Calls set_attribute(destination, attribute, value)
for each pair from the list attribute => value
.
See also the description of the methods set_attribute
and get_attribute
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_attributes(model, "tol" => 1e-4, "max_iter" => 100)
equivalent to the following:
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_attribute(model, "tol", 1e-4)
julia> set_attribute(model, "max_iter", 100)
#
JuMP.set_binary
— Method
set_binary(v::GenericVariableRef)
Adds a constraint requiring the variable `v` to take values in the set {0, 1}.
See also the description BinaryRef
, is_binary
and unset_binary
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_binary(x)
false
julia> set_binary(x)
julia> is_binary(x)
true
#
JuMP.set_dual_start_value
— Method
set_dual_start_value(con_ref::ConstraintRef, value)
Sets the dual start value (the MOI attribute `ConstraintDualStart`) of the constraint `con_ref` to `value`.
To remove a dual start value, set it to `nothing`.
See also the description of the method dual_start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 2.0);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> set_dual_start_value(c, [0.0])
julia> dual_start_value(c)
1-element Vector{Float64}:
0.0
julia> set_dual_start_value(c, nothing)
julia> dual_start_value(c)
#
JuMP.set_integer
— Method
set_integer(variable_ref::GenericVariableRef)
Adds an integer constraint for the variable `variable_ref'.
See also the description IntegerRef
, is_integer
and unset_integer
.
Example
julia> model = Model();
julia> @variable(model, x);
julia> is_integer(x)
false
julia> set_integer(x)
julia> is_integer(x)
true
#
JuMP.set_lower_bound
— Method
set_lower_bound(v::GenericVariableRef, lower::Number)
Sets the lower bound of the variable. If it doesn’t exist, it creates a new lower bound constraint.
See also the description LowerBoundRef
, has_lower_bound
, lower_bound
and delete_lower_bound
.
Example
julia> model = Model();
julia> @variable(model, x >= 1.0);
julia> lower_bound(x)
1.0
julia> set_lower_bound(x, 2.0)
julia> lower_bound(x)
2.0
#
JuMP.set_name
— Method
set_name(con_ref::ConstraintRef, s::AbstractString)
Sets the attribute of the constraint name.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> set_name(c, "my_constraint")
julia> name(c)
"my_constraint"
julia> c
my_constraint : [2 x] ∈ MathOptInterface.Nonnegatives(1)
#
JuMP.set_name
— Method
set_name(v::GenericVariableRef, s::AbstractString)
Sets the attribute of the variable name.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> set_name(x, "x_foo")
julia> x
x_foo
julia> name(x)
"x_foo"
#
JuMP.set_nonlinear_dual_start_value
— Method
set_nonlinear_dual_start_value(
model::Model,
start::Union{Nothing,Vector{Float64}},
)
Sets the value of the MOI attribute `MOI.NLPBlockDualStart'.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
The start vector corresponds to the Lagrangian duals of the nonlinear constraints, in the order given by `all_nonlinear_constraints`. That is, you must pass a single start vector corresponding to all of the nonlinear constraints in a single function call; you cannot set the dual start value of nonlinear constraints one at a time. The example below demonstrates how to use `all_nonlinear_constraints` to create a mapping between the nonlinear constraint references and the start vector.
To unset a previous start value, pass `nothing`.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> nl1 = @NLconstraint(model, x[1] <= sqrt(x[2]));
julia> nl2 = @NLconstraint(model, x[1] >= exp(x[2]));
julia> start = Dict(nl1 => -1.0, nl2 => 1.0);
julia> start_vector = [start[con] for con in all_nonlinear_constraints(model)]
2-element Vector{Float64}:
-1.0
1.0
julia> set_nonlinear_dual_start_value(model, start_vector)
julia> nonlinear_dual_start_value(model)
2-element Vector{Float64}:
-1.0
1.0
#
JuMP.set_nonlinear_objective
— Method
set_nonlinear_objective(
model::Model,
sense::MOI.OptimizationSense,
expr::Expr,
)
Sets the nonlinear objective of `model` to the expression `expr`, with the optimization sense `sense`.
This function is most useful if the expression `expr` is generated programmatically, and you cannot use `@NLobjective`.
Compatibility
This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface described in the Nonlinear Modeling section.
Notes
-
Variables should be interpolated directly into the expression `expr'.
-
Instead of
Min
andMax
, useMIN_SENSE
or `MAX_SENSE'.
Example
julia> model = Model();
julia> @variable(model, x);
julia> set_nonlinear_objective(model, MIN_SENSE, :($(x) + $(x)^2))
#
JuMP.set_normalized_coefficient
— Method
set_normalized_coefficient(
constraints::AbstractVector{<:ConstraintRef},
variables_1:AbstractVector{<:GenericVariableRef},
variables_2:AbstractVector{<:GenericVariableRef},
values::AbstractVector{<:Number},
)
Sets multiple quadratic coefficients associated with `variables_1` and `variables_2` in the constraints `constraints` to `values`.
Note that prior to this step, JuMP will aggregate terms that contain the same variables. For example, given the constraint 2x^2 + 3x^2 <= 2, calling set_normalized_coefficient([con], [x], [x], [4]) creates the constraint 4x^2 <= 2.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2
julia> set_normalized_coefficient([con, con], [x[1], x[1]], [x[1], x[2]], [4, 5])
julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
#
JuMP.set_normalized_coefficient
— Method
set_normalized_coefficient(
constraints::AbstractVector{<:ConstraintRef},
variables::AbstractVector{<:GenericVariableRef},
values::AbstractVector{<:Number},
)
Sets multiple coefficients of the variables `variables` in the constraints `constraints` to `values`.
Note that prior to this step, JuMP will aggregate terms that contain the same variable. For example, given the constraint 2x + 3x <= 2, calling set_normalized_coefficient([con], [x], [4]) creates the constraint 4x <= 2.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @variable(model, y)
y
julia> @constraint(model, con, 2x + 3x + 4y <= 2)
con : 5 x + 4 y ≤ 2
julia> set_normalized_coefficient([con, con], [x, y], [6, 7])
julia> con
con : 6 x + 7 y ≤ 2
#
JuMP.set_normalized_coefficient
— Method
set_normalized_coefficient(
constraint::ConstraintRef,
variable_1:GenericVariableRef,
variable_2:GenericVariableRef,
value::Number,
)
Sets the quadratic coefficient associated with `variable_1` and `variable_2` in the constraint `constraint` to `value`.
Note that prior to this step, JuMP will aggregate terms that contain the same variables. For example, given the constraint 2x^2 + 3x^2 <= 2, calling set_normalized_coefficient(con, x, x, 4) creates the constraint 4x^2 <= 2.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2
julia> set_normalized_coefficient(con, x[1], x[1], 4)
julia> set_normalized_coefficient(con, x[1], x[2], 5)
julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
#
JuMP.set_normalized_coefficient
— Method
set_normalized_coefficient(
con_ref::ConstraintRef,
variable::AbstractVariableRef,
new_coefficients::Vector{Tuple{Int64,T}},
)
Sets the coefficients of the variable `variable` in the constraint `con_ref` to `new_coefficients`, where each element of `new_coefficients` is a tuple that maps a row index to the new coefficient.
Note that prior to this step, during constraint creation, JuMP will aggregate terms that contain the same variable.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, con, [2x + 3x, 4x] in MOI.Nonnegatives(2))
con : [5 x, 4 x] ∈ MathOptInterface.Nonnegatives(2)
julia> set_normalized_coefficient(con, x, [(1, 2.0), (2, 5.0)])
julia> con
con : [2 x, 5 x] ∈ MathOptInterface.Nonnegatives(2)
#
JuMP.set_normalized_coefficient
— Method
set_normalized_coefficient(
constraint::ConstraintRef,
variable::GenericVariableRef,
value::Number,
)
Sets the coefficient of the variable variable in the constraint constraint to value.
Note that prior to this step, JuMP combines terms containing the same variable. For example, given the constraint 2x + 3x <= 2, calling set_normalized_coefficient(con, x, 4) creates the constraint 4x <= 2.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2
julia> set_normalized_coefficient(con, x, 4)
julia> con
con : 4 x ≤ 2
#
JuMP.set_normalized_coefficients
— Method
set_normalized_coefficients(
constraint::ConstraintRef{<:AbstractModel,<:MOI.ConstraintIndex{F}},
variable::AbstractVariableRef,
new_coefficients::Vector{Tuple{Int64,T}},
) where {T,F<:Union{MOI.VectorAffineFunction{T},MOI.VectorQuadraticFunction{T}}}
A deprecated method; calls are now redirected to set_normalized_coefficient
.
#
JuMP.set_normalized_rhs
— Method
set_normalized_rhs(
constraints::AbstractVector{<:ConstraintRef},
values::AbstractVector{<:Number}
)
Sets the right-hand side terms of the constraints constraints to values.
Note that prior to this step, JuMP normalizes the constraints by moving all constant terms onto the right-hand side. For example, given the constraint 2x + 1 <= 2, calling set_normalized_rhs([con], [4]) creates the constraint 2x <= 4, not 2x + 1 <= 4.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, con1, 2x + 1 <= 2)
con1 : 2 x ≤ 1
julia> @constraint(model, con2, 3x + 2 <= 4)
con2 : 3 x ≤ 2
julia> set_normalized_rhs([con1, con2], [4, 5])
julia> con1
con1 : 2 x ≤ 4
julia> con2
con2 : 3 x ≤ 5
#
JuMP.set_normalized_rhs
— Method
set_normalized_rhs(constraint::ConstraintRef, value::Number)
Sets the right-hand side term of the constraint constraint to value.
Note that prior to this step, JuMP normalizes the constraint by moving all constant terms onto the right-hand side. For example, given the constraint 2x + 1 <= 2, calling set_normalized_rhs(con, 4) creates the constraint 2x <= 4, not 2x + 1 <= 4.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1
julia> set_normalized_rhs(con, 4)
julia> con
con : 2 x ≤ 4
#
JuMP.set_objective
— Method
set_objective(model::AbstractModel, sense::MOI.OptimizationSense, func)
The functional equivalent of the @objective macro.
Sets the objective sense and the objective function at the same time; this is equivalent to calling set_objective_sense and set_objective_function separately.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> set_objective(model, MIN_SENSE, x)
#
JuMP.set_objective_coefficient
— Method
set_objective_coefficient(
model::GenericModel{T},
variables_1::AbstractVector{<:GenericVariableRef{T}},
variables_2::AbstractVector{<:GenericVariableRef{T}},
coefficients::AbstractVector{<:Real},
) where {T}
Sets several coefficients of the quadratic objective function related to variables_1
and variables_2
equal to coefficients
in a single call.
Please note: if a nonlinear objective function is specified, this function will return an error.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]
julia> set_objective_coefficient(model, [x[1], x[1]], [x[1], x[2]], [2, 3])
julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
#
JuMP.set_objective_coefficient
— Method
set_objective_coefficient(
model::GenericModel,
variables::Vector{<:GenericVariableRef},
coefficients::Vector{<:Real},
)
Sets multiple coefficients of the linear objective function, associated with variables, to coefficients in a single call.
Note: this function errors if a nonlinear objective function is set.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> @objective(model, Min, 3x + 2y + 1)
3 x + 2 y + 1
julia> set_objective_coefficient(model, [x, y], [5, 4])
julia> objective_function(model)
5 x + 4 y + 1
#
JuMP.set_objective_coefficient
— Method
set_objective_coefficient(
model::GenericModel{T},
variable_1::GenericVariableRef{T},
variable_2::GenericVariableRef{T},
coefficient::Real,
) where {T}
Sets the coefficient of the quadratic objective function associated with variable_1
and variable_2
to `coefficient'.
Please note: if a nonlinear objective function is specified, this function will return an error.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]
julia> set_objective_coefficient(model, x[1], x[1], 2)
julia> set_objective_coefficient(model, x[1], x[2], 3)
julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
#
JuMP.set_objective_coefficient
— Method
set_objective_coefficient(
model::GenericModel,
variable::GenericVariableRef,
coefficient::Real,
)
Sets the coefficient of the linear objective function associated with variable to coefficient.
Note: this function errors if a nonlinear objective function is set.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @objective(model, Min, 2x + 1)
2 x + 1
julia> set_objective_coefficient(model, x, 3)
julia> objective_function(model)
3 x + 1
#
JuMP.set_objective_function
— Function
set_objective_function(model::GenericModel, func::MOI.AbstractFunction)
set_objective_function(model::GenericModel, func::AbstractJuMPScalar)
set_objective_function(model::GenericModel, func::Real)
set_objective_function(model::GenericModel, func::Vector{<:AbstractJuMPScalar})
Sets the given function as the objective function of the model.
To set the objective sense, see set_objective_sense.
These are low-level functions; it is recommended to set the objective function using the @objective macro instead.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @objective(model, Min, x);
julia> objective_function(model)
x
julia> set_objective_function(model, 2 * x + 1)
julia> objective_function(model)
2 x + 1
#
JuMP.set_objective_sense
— Method
set_objective_sense(model::GenericModel, sense::MOI.OptimizationSense)
Sets the given sense as the objective sense of the model.
To set the objective function, see set_objective_function.
These are low-level functions; it is recommended to set the objective using the @objective macro instead.
Example
julia> model = Model();
julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2
julia> set_objective_sense(model, MOI.MAX_SENSE)
julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
#
JuMP.set_optimize_hook
— Method
set_optimize_hook(model::GenericModel, f::Union{Function,Nothing})
Sets the function f as the optimize hook for model.
The function f must have the signature f(model::GenericModel; kwargs...), where kwargs are the keyword arguments passed to optimize!.
Notes
-
The optimize hook should generally modify the model, or some external state, in some way, and then call optimize!(model; ignore_optimize_hook = true) to optimize the problem while bypassing the hook.
-
To remove the optimize hook, call set_optimize_hook(model, nothing).
Example
julia> model = Model();
julia> function my_hook(model::Model; kwargs...)
println(kwargs)
println("Calling with `ignore_optimize_hook = true`")
optimize!(model; ignore_optimize_hook = true)
return
end
my_hook (generic function with 1 method)
julia> set_optimize_hook(model, my_hook)
my_hook (generic function with 1 method)
julia> optimize!(model; test_arg = true)
Base.Pairs{Symbol, Bool, Tuple{Symbol}, @NamedTuple{test_arg::Bool}}(:test_arg => 1)
Calling with `ignore_optimize_hook = true`
ERROR: NoOptimizer()
[...]
#
JuMP.set_optimizer
— Method
set_optimizer(
model::GenericModel,
optimizer_factory;
add_bridges::Bool = true,
)
Creates an empty instance of MathOptInterface.AbstractOptimizer by calling optimizer_factory() and sets it as the optimizer of model. In particular, the optimizer_factory object must be callable with zero arguments and return an empty instance of MathOptInterface.AbstractOptimizer.
If add_bridges is true, constraints and objective functions that are not supported by the optimizer are automatically bridged to an equivalent supported formulation. Passing add_bridges = false can improve performance if the solver natively supports everything in model.
For setting solver-specific optimizer parameters, see set_attribute
.
Example
julia> import HiGHS
julia> model = Model();
julia> set_optimizer(model, () -> HiGHS.Optimizer())
julia> set_optimizer(model, HiGHS.Optimizer; add_bridges = false)
#
JuMP.set_optimizer_attribute
— Method
set_optimizer_attribute(
model::Union{GenericModel,MOI.OptimizerWithAttributes},
attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
value,
)
Sets the solver-specific attribute attr in model to value.
If attr is an AbstractString, this is equivalent to calling set_optimizer_attribute(model, MOI.RawOptimizerAttribute(name), value).
Compatibility
This method will remain in all JuMP v1.X releases, but it may be removed in a future v2.0 release. It is recommended to use set_attribute instead.
See also set_optimizer_attributes and get_optimizer_attribute
.
Example
julia> model = Model();
julia> set_optimizer_attribute(model, MOI.Silent(), true)
#
JuMP.set_optimizer_attributes
— Method
set_optimizer_attributes(
model::Union{GenericModel,MOI.OptimizerWithAttributes},
pairs::Pair...,
)
Calls set_optimizer_attribute(model, attribute, value)
for each pair from the list attribute => value
.
Compatibility
This method will remain in all JuMP v1.X releases, but it may be removed in a future v2.0 release. It is recommended to use set_attributes instead.
See also set_optimizer_attribute and get_optimizer_attribute
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_optimizer_attributes(model, "tol" => 1e-4, "max_iter" => 100)
equivalent to the following:
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_optimizer_attribute(model, "tol", 1e-4)
julia> set_optimizer_attribute(model, "max_iter", 100)
#
JuMP.set_parameter_value
— Method
set_parameter_value(x::GenericVariableRef, value)
Changes the parameter constraint on the variable x to value.
Errors if x is not a parameter.
See also the description ParameterRef
, is_parameter
and parameter_value
.
Example
julia> model = Model();
julia> @variable(model, p in Parameter(2))
p
julia> parameter_value(p)
2.0
julia> set_parameter_value(p, 2.5)
julia> parameter_value(p)
2.5
#
JuMP.set_silent
— Method
set_silent(model::GenericModel)
Takes precedence over any other attribute that controls verbosity and requires the solver to produce no output.
See also the description of the method unset_silent
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_silent(model)
julia> get_attribute(model, MOI.Silent())
true
julia> unset_silent(model)
julia> get_attribute(model, MOI.Silent())
false
#
JuMP.set_start_value
— Method
set_start_value(con_ref::ConstraintRef, value)
Sets the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref to value.
To remove the primal start value, set it to nothing.
See also the description of the method start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 2.0);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> set_start_value(c, [4.0])
julia> start_value(c)
1-element Vector{Float64}:
4.0
julia> set_start_value(c, nothing)
julia> start_value(c)
#
JuMP.set_start_value
— Method
set_start_value(variable::GenericVariableRef, value::Union{Real,Nothing})
Sets the start value (MOI.VariablePrimalStart) of the variable variable to value.
To remove the start value, pass nothing.
Note: VariablePrimalStart values are sometimes called "MIP-starts" or "warmstarts".
See also the description of the methods has_start_value
and start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 1.5);
julia> @variable(model, y);
julia> has_start_value(x)
true
julia> has_start_value(y)
false
julia> start_value(x)
1.5
julia> set_start_value(x, nothing)
julia> has_start_value(x)
false
julia> set_start_value(y, 2.0)
julia> has_start_value(y)
true
julia> start_value(y)
2.0
#
JuMP.set_start_values
— Method
set_start_values(
model::GenericModel;
variable_primal_start::Union{Nothing,Function} = value,
constraint_primal_start::Union{Nothing,Function} = value,
constraint_dual_start::Union{Nothing,Function} = dual,
nonlinear_dual_start::Union{Nothing,Function} = nonlinear_dual_start_value,
)
Sets the primal and dual start values in model using the provided functions.
If any keyword argument is nothing, the corresponding start value is skipped.
If the optimizer does not support setting a start value, that value is skipped.
variable_primal_start
This function controls the primal start solution for variables. It is equivalent to calling set_start_value for each variable, or to setting the MOI.VariablePrimalStart attribute.
If it is a function, it must have the form variable_primal_start(x::VariableRef), mapping each variable x to a primal start value.
Defaults to value
.
constraint_primal_start
This function controls the primal start solution for constraints. It is equivalent to calling set_start_value for each constraint, or to setting the MOI.ConstraintPrimalStart attribute.
If it is a function, it must have the form constraint_primal_start(ci::ConstraintRef), mapping each constraint ci to a primal start value.
Defaults to value
.
constraint_dual_start
This function controls the dual start solution for constraints. It is equivalent to calling set_dual_start_value for each constraint, or to setting the MOI.ConstraintDualStart attribute.
If it is a function, it must have the form constraint_dual_start(ci::ConstraintRef), mapping each constraint ci to a dual start value.
Defaults to dual
.
nonlinear_dual_start
This function controls the dual start solution for nonlinear constraints. It is equivalent to calling set_nonlinear_dual_start_value.
If it is a function, it must have the form nonlinear_dual_start(model::GenericModel), returning a vector corresponding to the dual start of the constraints.
Defaults to nonlinear_dual_start_value
.
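As an illustrative sketch (not part of the original docstring), assuming the HiGHS solver is installed, a solved model's primal solution can be copied back as its own start values; the dual starts are skipped here for simplicity:
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1);
julia> @objective(model, Min, x);
julia> optimize!(model)
julia> set_start_values(model; constraint_dual_start = nothing, nonlinear_dual_start = nothing)
julia> start_value(x)
1.0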
#
JuMP.set_string_names_on_creation
— Method
set_string_names_on_creation(model::GenericModel, value::Bool)
Sets the default value of the set_string_name keyword argument in the @variable and @constraint macros to value.
The set_string_name keyword argument controls whether String names are assigned to all variables and constraints in model.
By default, value is true. However, for large models, calling set_string_names_on_creation(model, false) can improve performance, at the cost of reducing the readability of printed output and solver log messages.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_string_names_on_creation(model)
true
julia> set_string_names_on_creation(model, false)
julia> set_string_names_on_creation(model)
false
#
JuMP.set_time_limit_sec
— Method
set_time_limit_sec(model::GenericModel, limit::Float64)
Sets the time limit (in seconds) for the solver.
The limit can be removed with unset_time_limit_sec, or by setting limit to nothing.
See also the description of the methods unset_time_limit_sec
and time_limit_sec
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> time_limit_sec(model)
julia> set_time_limit_sec(model, 60.0)
julia> time_limit_sec(model)
60.0
julia> unset_time_limit_sec(model)
julia> time_limit_sec(model)
#
JuMP.set_upper_bound
— Method
set_upper_bound(v::GenericVariableRef, upper::Number)
Sets the upper bound of the variable. If one does not exist, creates a new upper-bound constraint.
See also the description UpperBoundRef
, has_upper_bound
, upper_bound
and delete_upper_bound
.
Example
julia> model = Model();
julia> @variable(model, x <= 1.0);
julia> upper_bound(x)
1.0
julia> set_upper_bound(x, 2.0)
julia> upper_bound(x)
2.0
#
JuMP.set_value
— Method
set_value(p::NonlinearParameter, v::Number)
Stores the value of v
in the nonlinear parameter p.
Compatibility
This function is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling.
Example
julia> model = Model();
julia> @NLparameter(model, p == 0)
p == 0.0
julia> set_value(p, 5)
5
julia> value(p)
5.0
#
JuMP.shadow_price
— Method
shadow_price(con_ref::ConstraintRef)
Returns the change in the objective from an infinitesimal relaxation of the constraint.
The shadow price is computed from dual and can only be queried when has_duals is true and the objective sense is MIN_SENSE or MAX_SENSE (not FEASIBILITY_SENSE).
See also the description of the method reduced_cost
.
Comparison with dual
The shadow price differs from the dual value by at most a sign, depending on the objective sense.
Notes
-
This function simply flips the sign of the dual value and does not check the conditions required to interpret it as a sensitivity. For example, the caller is responsible for checking that the solver converged to an optimal primal-dual pair or to a proof of infeasibility.
-
This calculation depends on the current objective sense of the model. If the sense has changed since the last solve, the results will be incorrect.
-
The relaxation of an equality constraint (and hence the shadow price) is defined with respect to whichever sense of the equality constraint is active.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x);
julia> @constraint(model, c, x <= 1)
c : x ≤ 1
julia> @objective(model, Max, 2 * x + 1);
julia> optimize!(model)
julia> has_duals(model)
true
julia> shadow_price(c)
2.0
#
JuMP.shape
— Function
shape(c::AbstractConstraint)::AbstractShape
Returns the shape of the constraint c
.
Example
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> c = @constraint(model, x[2] <= 1);
julia> shape(constraint_object(c))
ScalarShape()
julia> d = @constraint(model, x in SOS1());
julia> shape(constraint_object(d))
VectorShape()
#
JuMP.show_backend_summary
— Method
show_backend_summary(io::IO, model::GenericModel)
Prints a summary of the optimizer backing model.
Extensions
This method should be implemented by subtypes of AbstractModel.
Example
julia> model = Model();
julia> show_backend_summary(stdout, model)
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
#
JuMP.show_constraints_summary
— Method
show_constraints_summary(io::IO, model::AbstractModel)
Writes a summary of the number of constraints to the io stream.
Extensions
This method should be implemented by subtypes of AbstractModel.
Example
julia> model = Model();
julia> @variable(model, x >= 0);
julia> show_constraints_summary(stdout, model)
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 1 constraint
#
JuMP.show_objective_function_summary
— Method
show_objective_function_summary(io::IO, model::AbstractModel)
Writes a summary of the objective function type to the io stream.
Extensions
This method should be implemented by subtypes of AbstractModel.
Example
julia> model = Model();
julia> show_objective_function_summary(stdout, model)
Objective function type: AffExpr
#
JuMP.simplex_iterations
— Method
simplex_iterations(model::GenericModel)
If available, returns the cumulative number of simplex iterations during the last optimization (the MOI.SimplexIterations
attribute).
Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> optimize!(model)
julia> simplex_iterations(model)
0
#
JuMP.solution_summary
— Method
solution_summary(model::GenericModel; result::Int = 1, verbose::Bool = false)
Returns a struct that can be used to print a summary of the solution with result index result.
With verbose=true, the primal solution of every variable and the dual solution of every constraint are printed, except those with empty names.
Example
When called at the REPL, the summary is printed automatically:
julia> model = Model();
julia> solution_summary(model)
* Solver : No optimizer attached.
* Status
Result count : 0
Termination status : OPTIMIZE_NOT_CALLED
Message from the solver:
"optimize not called"
* Candidate solution (result #1)
Primal status : NO_SOLUTION
Dual status : NO_SOLUTION
* Work counters
Use print to force the summary to be printed from inside a function:
julia> model = Model();
julia> function foo(model)
print(solution_summary(model))
return
end
foo (generic function with 1 method)
julia> foo(model)
* Solver : No optimizer attached.
* Status
Result count : 0
Termination status : OPTIMIZE_NOT_CALLED
Message from the solver:
"optimize not called"
* Candidate solution (result #1)
Primal status : NO_SOLUTION
Dual status : NO_SOLUTION
* Work counters
#
JuMP.solve_time
— Method
solve_time(model::GenericModel)
Returns the wall-clock solve time in seconds, as reported by the solver (the MOI.SolveTimeSec attribute), if available.
Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.
Example
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> optimize!(model)
julia> solve_time(model)
1.0488089174032211e-5
#
JuMP.solver_name
— Method
solver_name(model::GenericModel)
Returns the MOI.SolverName attribute of the underlying optimizer, if available.
If no optimizer is attached in AUTOMATIC or MANUAL mode, returns "No optimizer attached.".
If the attribute is not implemented, returns "SolverName() attribute not implemented by the optimizer.".
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> solver_name(model)
"Ipopt"
julia> model = Model();
julia> solver_name(model)
"No optimizer attached."
julia> model = Model(MOI.FileFormats.MPS.Model);
julia> solver_name(model)
"SolverName() attribute not implemented by the optimizer."
#
JuMP.start_value
— Method
start_value(con_ref::ConstraintRef)
Returns the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref.
If no primal start value has been set, start_value returns nothing.
See also the description of the method set_start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 2.0);
julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ MathOptInterface.Nonnegatives(1)
julia> set_start_value(c, [4.0])
julia> start_value(c)
1-element Vector{Float64}:
4.0
julia> set_start_value(c, nothing)
julia> start_value(c)
#
JuMP.start_value
— Method
start_value(v::GenericVariableRef)
Returns the start value (MOI.VariablePrimalStart) of the variable v.
Note: VariablePrimalStart values are sometimes called "MIP-starts" or "warmstarts".
See also the description of the methods has_start_value
and set_start_value
.
Example
julia> model = Model();
julia> @variable(model, x, start = 1.5);
julia> @variable(model, y);
julia> has_start_value(x)
true
julia> has_start_value(y)
false
julia> start_value(x)
1.5
julia> set_start_value(y, 2.0)
julia> has_start_value(y)
true
julia> start_value(y)
2.0
#
JuMP.termination_status
— Method
termination_status(model::GenericModel)
Returns a MOI.TerminationStatusCode describing why the solver stopped (that is, the MOI.TerminationStatus attribute).
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> termination_status(model)
OPTIMIZE_NOT_CALLED::TerminationStatusCode = 0
#
JuMP.time_limit_sec
— Method
time_limit_sec(model::GenericModel)
Returns the time limit (in seconds) of model.
If no limit is set, returns nothing.
See also the description of the methods set_time_limit_sec
and unset_time_limit_sec
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> time_limit_sec(model)
julia> set_time_limit_sec(model, 60.0)
julia> time_limit_sec(model)
60.0
julia> unset_time_limit_sec(model)
julia> time_limit_sec(model)
#
JuMP.triangle_vec
— Method
triangle_vec(matrix::Matrix)
Returns the upper triangle of the matrix flattened into a vector, in the order required by JuMP and MathOptInterface for Triangle sets.
Example
julia> model = Model();
julia> @variable(model, X[1:3, 1:3], Symmetric);
julia> @variable(model, t)
t
julia> @constraint(model, [t; triangle_vec(X)] in MOI.RootDetConeTriangle(3))
[t, X[1,1], X[1,2], X[2,2], X[1,3], X[2,3], X[3,3]] ∈ MathOptInterface.RootDetConeTriangle(3)
#
JuMP.unfix
— Method
unfix(v::GenericVariableRef)
Removes the fixing constraint of the variable.
Errors if one does not exist.
Example
julia> model = Model();
julia> @variable(model, x == 1);
julia> is_fixed(x)
true
julia> unfix(x)
julia> is_fixed(x)
false
#
JuMP.unregister
— Method
unregister(model::GenericModel, key::Symbol)
De-registers the name key
in model
so that a new variable, constraint, or expression with the same key can be created.
Note that the object itself is not deleted; only the reference at model[key] is removed. To delete the object as well, call delete
.
See also the description of the methods delete
and object_dictionary
.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @variable(model, x)
ERROR: An object of name x is already attached to this model. If this
is intended, consider using the anonymous construction syntax, for example,
`x = @variable(model, [1:N], ...)` where the name of the object does
not appear inside the macro.
Alternatively, use `unregister(model, :x)` to first unregister
the existing name from the model. Note that this will not delete the
object; it will just remove the reference at `model[:x]`.
Stacktrace:
[...]
julia> num_variables(model)
1
julia> unregister(model, :x)
julia> @variable(model, x)
x
julia> num_variables(model)
2
#
JuMP.unsafe_backend
— Method
unsafe_backend(model::GenericModel)
Returns the innermost optimizer associated with the JuMP model model.
This function should only be used by advanced users who need low-level access to the solver. There is a high risk of misusing it; we strongly recommend the alternative suggested below.
See also the backend method.
To obtain the index of a variable or constraint in the unsafe backend, use optimizer_index
.
Unsafe behavior
This function is unsafe for two main reasons.
First, the formulation and ordering of variables and constraints in the unsafe backend may differ from those in model. This may be caused by bridges, or by the solver requiring variables or constraints in a specific order. In addition, the index of a variable or constraint returned by index at the JuMP level may differ from the index of the corresponding variable or constraint in the unsafe backend. There is no way around this; instead, use the alternative suggested below.
Second, the unsafe backend may be empty or may be missing some modifications made to the JuMP model. Therefore, before calling unsafe_backend, you should first call MOI.Utilities.attach_optimizer to ensure that the backend is synchronized with the JuMP model.
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: HiGHS
julia> MOI.Utilities.attach_optimizer(model)
julia> inner = unsafe_backend(model)
A HiGHS model with 0 columns and 0 rows.
Moreover, if you modify the JuMP model, the reference to the backend (i.e. inner in the example above) may go stale, and you will have to call MOI.Utilities.attach_optimizer again.
This function is also unsafe in the reverse direction: if you modify the unsafe backend, for example by adding a new constraint to inner, those changes may be silently discarded by JuMP when the JuMP model is modified or solved.
Alternative
Instead of unsafe_backend, create a model using direct_model and call backend instead.
For example, instead of the following code:
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 0)
x
julia> MOI.Utilities.attach_optimizer(model)
julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.
julia> optimizer_index(x)
MOI.VariableIndex(1)
Use this one:
julia> import HiGHS
julia> model = direct_model(HiGHS.Optimizer());
julia> set_silent(model)
julia> @variable(model, x >= 0)
x
julia> highs = backend(model) # You don't need to call `attach_optimizer'.
A HiGHS model with 1 columns and 0 rows.
julia> index(x)
MOI.VariableIndex(1)
#
JuMP.unset_binary
— Method
unset_binary(variable_ref::GenericVariableRef)
Removes the binary constraint for the variable `variable_ref'.
See also the description BinaryRef
, is_binary
and set_binary
.
Example
julia> model = Model();
julia> @variable(model, x, Bin);
julia> is_binary(x)
true
julia> unset_binary(x)
julia> is_binary(x)
false
#
JuMP.unset_integer
— Method
unset_integer(variable_ref::GenericVariableRef)
Removes the integrality constraint on the variable variable_ref.
Errors if one does not exist.
See also the description IntegerRef
, is_integer
and set_integer
.
Example
julia> model = Model();
julia> @variable(model, x, Int);
julia> is_integer(x)
true
julia> unset_integer(x)
julia> is_integer(x)
false
#
JuMP.unset_silent
— Method
unset_silent(model::GenericModel)
Reverses the effect of set_silent, returning control over verbosity to the solver's own attributes.
See also the description of the method set_silent
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> set_silent(model)
julia> get_attribute(model, MOI.Silent())
true
julia> unset_silent(model)
julia> get_attribute(model, MOI.Silent())
false
#
JuMP.unset_time_limit_sec
— Method
unset_time_limit_sec(model::GenericModel)
Removes the time limit for the solver.
See also the description of the methods set_time_limit_sec
and time_limit_sec
.
Example
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> time_limit_sec(model)
julia> set_time_limit_sec(model, 60.0)
julia> time_limit_sec(model)
60.0
julia> unset_time_limit_sec(model)
julia> time_limit_sec(model)
#
JuMP.upper_bound
— Method
upper_bound(v::GenericVariableRef)
Returns the upper bound of the variable.
Errors if one does not exist.
See also the description UpperBoundRef
, has_upper_bound
, set_upper_bound
and delete_upper_bound
.
Example
julia> model = Model();
julia> @variable(model, x <= 1.0);
julia> upper_bound(x)
1.0
#
JuMP.value
— Method
value(con_ref::ConstraintRef; result::Int = 1)
Returns the primal value of the constraint con_ref associated with result index result of the most recent solution returned by the solver.
That is, if con_ref is a reference to the constraint func-in-set, it returns func evaluated at the values of the variables (as given by value(::GenericVariableRef)).
To check if the result exists before requesting the values, use the method has_values
.
See also the description of the method result_count
.
Note
For scalar constraints, the constant term is moved into the set and is therefore not included in the primal value of the constraint. For example, the constraint @constraint(model, 2x + 3y + 1 == 5) is converted to 2x + 3y in MOI.EqualTo(4), so the value returned by this function is the evaluation of 2x + 3y
.
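A minimal sketch of this note (assuming the HiGHS solver is installed): the constant 1 is moved into the set, so value(c) reports the evaluation of 2x + 3y only.
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x == 1);
julia> @variable(model, y == 1);
julia> @constraint(model, c, 2x + 3y + 1 == 6);
julia> optimize!(model)
julia> value(c)
5.0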
#
JuMP.value
— Method
value(var_value::Function, con_ref::ConstraintRef)
Evaluates the primal value of the constraint con_ref, using var_value(v) as the value of each variable v.
#
JuMP.value
— Method
value(var_value::Function, v::GenericVariableRef)
Calculates the value of the variable v
as var_value(v).
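A one-line sketch for illustration; the anonymous function below simply assigns every variable the value 3.0:
julia> model = Model();
julia> @variable(model, x);
julia> value(v -> 3.0, x)
3.0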
#
JuMP.value
— Method
value(var_value::Function, c::NonlinearConstraintRef)
Evaluates c, using var_value(v) as the value of each variable v.
#
JuMP.value
— Method
value(var_value::Function, ex::NonlinearExpression)
Evaluates ex, using var_value(v) as the value of each variable v.
#
JuMP.value
— Method
value(v::GenericAffExpr; result::Int = 1)
Returns the value of the GenericAffExpr v associated with result index result of the most recent solution returned by the solver.
See also the description of the method result_count
.
#
JuMP.value
— Method
value(v::GenericQuadExpr; result::Int = 1)
Returns the value of the GenericQuadExpr v associated with result index result of the most recent solution returned by the solver.
Replaces getvalue in most use cases.
See also the description of the method result_count
.
#
JuMP.value
— Method
value(c::NonlinearConstraintRef; result::Int = 1)
Returns the value of the NonlinearConstraintRef c associated with result index result of the most recent solution returned by the solver.
See also the description of the method result_count
.
#
JuMP.value
— Method
value(ex::NonlinearExpression; result::Int = 1)
Returns the value of the NonlinearExpression ex associated with result index result of the most recent solution returned by the solver.
See also the description of the method result_count
.
#
JuMP.value
— Method
value(p::NonlinearParameter)
Returns the current value stored in the nonlinear parameter p.
Example
julia> model = Model();
julia> @NLparameter(model, p == 10)
p == 10.0
julia> value(p)
10.0
#
JuMP.value
— Method
value(v::GenericVariableRef; result = 1)
Returns the value of the variable v associated with result index result of the most recent solution returned by the solver.
To check if the result exists before requesting the values, use the method has_values
.
See also the description of the method result_count
.
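A minimal sketch, assuming the HiGHS solver is installed:
julia> import HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 1);
julia> @objective(model, Min, 2x);
julia> optimize!(model)
julia> value(x)
1.0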
#
JuMP.value
— Method
value(var_value::Function, ex::GenericQuadExpr)
Evaluates ex, using var_value(v) as the value of each variable v.
#
JuMP.value
— Method
value(var_value::Function, ex::GenericAffExpr)
Evaluates ex, using var_value(v) as the value of each variable v.
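For illustration, a sketch that evaluates an affine expression at fixed variable values without solving; the anonymous function assigns every variable the value 2.0:
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> ex = 2x + 3y + 1;
julia> value(v -> 2.0, ex)
11.0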
#
JuMP.variable_by_name
— Method
variable_by_name(
model::AbstractModel,
name::String,
)::Union{AbstractVariableRef,Nothing}
Returns a reference to the variable with name attribute name, or nothing if no variable has that name attribute. Errors if multiple variables have the name attribute name.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> variable_by_name(model, "x")
x
julia> @variable(model, base_name="x")
x
julia> variable_by_name(model, "x")
ERROR: Multiple variables have the name x.
Stacktrace:
[1] error(::String) at ./error.jl:33
[2] get(::MOIU.Model{Float64}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/model.jl:222
[3] get at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/universalfallback.jl:201 [inlined]
[4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MOIU.Model{Float64}}}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/cachingoptimizer.jl:490
[5] variable_by_name(::GenericModel, ::String) at /home/blegat/.julia/dev/JuMP/src/variables.jl:268
[6] top-level scope at none:0
julia> var = @variable(model, base_name="y")
y
julia> variable_by_name(model, "y")
y
julia> set_name(var, "z")
julia> variable_by_name(model, "y")
julia> variable_by_name(model, "z")
z
julia> @variable(model, u[1:2])
2-element Vector{VariableRef}:
u[1]
u[2]
julia> variable_by_name(model, "u[2]")
u[2]
#
JuMP.variable_ref_type
— Method
variable_ref_type(::Union{F,Type{F}}) where {F}
A helper function used internally by JuMP and some JuMP extensions. Returns the variable type associated with the model or expression type F.
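An illustrative sketch; it assumes, as the docstring suggests, that the function also accepts a model instance:
julia> model = Model();
julia> variable_ref_type(model) == VariableRef
true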
#
JuMP.vectorize
— Function
vectorize(matrix::AbstractMatrix, ::Shape)
Converts the matrix into a vector, according to Shape
.
#
JuMP.write_to_file
— Method
write_to_file(
model::GenericModel,
filename::String;
format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
kwargs...,
)
Writes the JuMP model model to filename in the format format.
If the file name ends with .gz, the file is compressed with GZip. If the file name ends with .bz2, the file is compressed with BZip2.
Other kwargs are passed to the Model constructor of the chosen format.
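A brief sketch; the file name here is purely illustrative, and the MPS format is chosen automatically from the extension:
julia> model = Model();
julia> @variable(model, x >= 0);
julia> @objective(model, Min, x);
julia> write_to_file(model, "my_model.mps")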
#
MathOptInterface.Utilities.attach_optimizer
— Method
MOIU.attach_optimizer(model::GenericModel)
Calls MOIU.attach_optimizer
for the model
backend.
Cannot be called in direct mode.
#
MathOptInterface.Utilities.drop_optimizer
— Method
MOIU.drop_optimizer(model::GenericModel)
Calls MOIU.drop_optimizer
for the model
backend.
Cannot be called in direct mode.
#
MathOptInterface.Utilities.reset_optimizer
— Function
MOIU.reset_optimizer(model::GenericModel, optimizer::MOI.AbstractOptimizer)
Calls MOIU.reset_optimizer
for the model
backend.
Cannot be called in direct mode.
#
MathOptInterface.Utilities.reset_optimizer
— Method
MOIU.reset_optimizer(model::GenericModel)
Calls MOIU.reset_optimizer
for the model
backend.
Cannot be called in direct mode.
#
MathOptInterface.get
— Method
get(model::GenericModel, attr::MathOptInterface.AbstractModelAttribute)
Returns the value of the attr
attribute from the model’s MOI backend.
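For example, a minimal sketch querying a model attribute (the objective sense):
julia> model = Model();
julia> @objective(model, Max, 0);
julia> MOI.get(model, MOI.ObjectiveSense())
MAX_SENSE::OptimizationSense = 1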
#
MathOptInterface.get
— Method
get(model::GenericModel, attr::MathOptInterface.AbstractOptimizerAttribute)
Returns the value of the attr
attribute from the model’s MOI backend.
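And a sketch for an optimizer attribute, assuming Ipopt is installed:
julia> import Ipopt
julia> model = Model(Ipopt.Optimizer);
julia> MOI.get(model, MOI.SolverName())
"Ipopt"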
#
JuMP.@NLconstraint
— Macro
@NLconstraint(model::GenericModel, expr)
Adds a constraint described by the nonlinear expression expr. See also the @constraint macro.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLconstraint can be replaced with @constraint.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @NLconstraint(model, sin(x) <= 1)
sin(x) - 1.0 ≤ 0
julia> @NLconstraint(model, [i = 1:3], sin(i * x) <= 1 / i)
3-element Vector{NonlinearConstraintRef{ScalarShape}}:
(sin(1.0 * x) - 1.0 / 1.0) - 0.0 ≤ 0
(sin(2.0 * x) - 1.0 / 2.0) - 0.0 ≤ 0
(sin(3.0 * x) - 1.0 / 3.0) - 0.0 ≤ 0
#
JuMP.@NLconstraints
— Macro
@NLconstraints(model, args...)
Adds several nonlinear constraints to the model at once, in the same fashion as the @NLconstraint macro.
The model must be the first argument, and multiple constraints can be added on multiple lines, wrapped in a begin ... end block.
The macro returns a tuple containing the constraints that were defined.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLconstraints can be replaced with @constraints.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> @variable(model, t);
julia> @variable(model, z[1:2]);
julia> a = [4, 5];
julia> @NLconstraints(model, begin
t >= sqrt(x^2 + y^2)
[i = 1:2], z[i] <= log(a[i])
end)
((t - sqrt(x ^ 2.0 + y ^ 2.0)) - 0.0 ≥ 0, NonlinearConstraintRef{ScalarShape}[(z[1] - log(4.0)) - 0.0 ≤ 0, (z[2] - log(5.0)) - 0.0 ≤ 0])
#
JuMP.@NLexpression
— Macro
@NLexpression(args...)
Efficiently builds a nonlinear expression, which can then be inserted into other nonlinear constraints and the objective function. See also the @expression macro.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLexpression can be replaced with @expression.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @variable(model, y)
y
julia> @NLexpression(model, my_expr, sin(x)^2 + cos(x^2))
subexpression[1]: sin(x) ^ 2.0 + cos(x ^ 2.0)
julia> @NLconstraint(model, my_expr + y >= 5)
(subexpression[1] + y) - 5.0 ≥ 0
julia> @NLobjective(model, Min, my_expr)
Indexing over sets and anonymous expressions are also supported:
julia> @NLexpression(model, my_expr_1[i=1:3], sin(i * x))
3-element Vector{NonlinearExpression}:
subexpression[2]: sin(1.0 * x)
subexpression[3]: sin(2.0 * x)
subexpression[4]: sin(3.0 * x)
julia> my_expr_2 = @NLexpression(model, log(1 + sum(exp(my_expr_1[i]) for i in 1:2)))
subexpression[5]: log(1.0 + (exp(subexpression[2]) + exp(subexpression[3])))
#
JuMP.@NLexpressions
— Macro
@NLexpressions(model, args...)
Adds several nonlinear expressions to the model at once, in the same fashion as the @NLexpression macro.
The model must be the first argument, and multiple expressions can be added on multiple lines, wrapped in a begin ... end block.
The macro returns a tuple containing the expressions that were defined.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLexpressions can be replaced with @expressions.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> @variable(model, z[1:2]);
julia> a = [4, 5];
julia> @NLexpressions(model, begin
my_expr, sqrt(x^2 + y^2)
my_expr_1[i = 1:2], log(a[i]) - z[i]
end)
(subexpression[1]: sqrt(x ^ 2.0 + y ^ 2.0), NonlinearExpression[subexpression[2]: log(4.0) - z[1], subexpression[3]: log(5.0) - z[2]])
#
JuMP.@NLobjective
— Macro
@NLobjective(model, sense, expression)
Adds a nonlinear objective function to model with optimization sense sense. The sense argument must be Max or Min.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLobjective can be replaced with @objective.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> @NLobjective(model, Max, 2x + 1 + sin(x))
julia> print(model)
Max 2.0 * x + 1.0 + sin(x)
Subject to
#
JuMP.@NLparameter
— Macro
@NLparameter(model, param == value)
Creates and returns a nonlinear parameter param attached to the model model, with initial value set to value. Nonlinear parameters may be used only in nonlinear expressions.
Example
julia> model = Model();
julia> @NLparameter(model, x == 10)
x == 10.0
julia> value(x)
10.0
@NLparameter(model, value = param_value)
Creates and returns an anonymous nonlinear parameter param attached to the model model, with initial value set to param_value. Nonlinear parameters may be used only in nonlinear expressions.
Example
julia> model = Model();
julia> x = @NLparameter(model, value = 10)
parameter[1] == 10.0
julia> value(x)
10.0
@NLparameter(model, param_collection[...] == value_expr)
Creates and returns a collection of nonlinear parameters param_collection attached to the model model, with initial values set to value_expr (which may depend on the index sets). Index sets are specified with the same syntax as @variable.
Example
julia> model = Model();
julia> @NLparameter(model, y[i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
parameter[1] == 2.0
parameter[2] == 4.0
parameter[3] == 6.0
julia> value(y[2])
4.0
@NLparameter(model, [...] == value_expr)
Creates and returns an anonymous collection of nonlinear parameters attached to the model model, with initial values set to value_expr (which may depend on the index sets). Index sets are specified with the same syntax as @variable.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, a call like @NLparameter(model, p == value) can be replaced with @variable(model, p in Parameter(value)).
Example
julia> model = Model();
julia> y = @NLparameter(model, [i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
parameter[1] == 2.0
parameter[2] == 4.0
parameter[3] == 6.0
julia> value(y[2])
4.0
#
JuMP.@NLparameters
— Macro
@NLparameters(model, args...)
Creates and returns multiple nonlinear parameters attached to the model, in the same fashion as the @NLparameter macro.
The model must be the first argument, and multiple parameters can be added on multiple lines, wrapped in a begin ... end block. Individual parameters should appear on separate lines, as in the following example.
The macro returns a tuple containing the parameters that were defined.
Compatibility
This macro is part of the legacy nonlinear interface. It is recommended to use the new nonlinear interface, described in the section Nonlinear Modeling. In most cases, @NLparameters can be replaced with @variables using the p in Parameter(value) syntax.
Example
julia> model = Model();
julia> @NLparameters(model, begin
x == 10
b == 156
end);
julia> value(x)
10.0
#
JuMP.@build_constraint
— Macro
@build_constraint(constraint_expr)
Creates a ScalarConstraint
or VectorConstraint
constraint using the same machinery as the @constraint macro, but without adding the constraint to a model.
Constraints using broadcast operators, such as x .<= 1, are also supported; they create arrays of ScalarConstraint or VectorConstraint.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @build_constraint(2x >= 1)
ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(2 x, MathOptInterface.GreaterThan{Float64}(1.0))
julia> model = Model();
julia> @variable(model, x[1:2]);
julia> @build_constraint(x .>= 0)
2-element Vector{ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}}:
ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[1], MathOptInterface.GreaterThan{Float64}(-0.0))
ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[2], MathOptInterface.GreaterThan{Float64}(-0.0))
#
JuMP.@constraint
— Macro
@constraint(model, expr, args...; kwargs...)
@constraint(model, [index_sets...], expr, args...; kwargs...)
@constraint(model, name, expr, args...; kwargs...)
@constraint(model, name[index_sets...], expr, args...; kwargs...)
Adds a constraint described by the expression expr.
The name argument is optional. If index sets are passed, a container is built and the constraint may depend on the indices of the index sets.
The expression expr
can have one of the following forms:
-
func in set — constrains the function func to belong to the set set, which is either a MOI.AbstractSet or one of JuMP's shorthand sets, for example SecondOrderCone();
-
a <op> b, where <op> is ==, ≥, >=, ≤ or <=;
-
l <= f <= u or u >= f >= l — constrains the expression f to lie between l and u;
-
f(x) ⟂ x — defines a complementarity constraint;
-
z --> {expr} — defines an indicator constraint that activates when z is one;
-
!z --> {expr} — defines an indicator constraint that activates when z is zero;
-
z <--> {expr} — defines a reified constraint;
-
expr := rhs — defines a Boolean equality constraint.
Broadcasted comparison operators such as .== are also supported when the left- and right-hand sides of the comparison are arrays.
JuMP extensions may provide support for other constraint expressions that are not listed here.
Keyword arguments
-
base_name: sets the name prefix used to generate constraint names. For a scalar constraint, the name is base_name itself; otherwise, the constraint names are base_name[...] for each index of the index sets.
-
container = :Auto: forces the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type supported by a JuMP extension.
-
set_string_name::Bool = true: controls whether to set the MOI.ConstraintName attribute. Passing set_string_name = false can improve performance.
JuMP extensions may support other keyword arguments as well.
Example
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> @variable(model, z, Bin);
julia> @constraint(model, x in SecondOrderCone())
[x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)
julia> @constraint(model, [i in 1:3], x[i] == i)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
x[1] = 1
x[2] = 2
x[3] = 3
julia> @constraint(model, x .== [1, 2, 3])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
x[1] = 1
x[2] = 2
x[3] = 3
julia> @constraint(model, con_name, 1 <= x[1] + x[2] <= 3)
con_name : x[1] + x[2] ∈ [1, 3]
julia> @constraint(model, con_perp[i in 1:3], x[i] - 1 ⟂ x[i])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorAffineFunction{Float64}, MathOptInterface.Complements}, VectorShape}}:
con_perp[1] : [x[1] - 1, x[1]] ∈ MathOptInterface.Complements(2)
con_perp[2] : [x[2] - 1, x[2]] ∈ MathOptInterface.Complements(2)
con_perp[3] : [x[3] - 1, x[3]] ∈ MathOptInterface.Complements(2)
julia> @constraint(model, z --> {x[1] >= 0})
z --> {x[1] ≥ 0}
julia> @constraint(model, !z --> {2 * x[2] <= 3})
!z --> {2 x[2] ≤ 3}
#
JuMP.@constraints
— Macro
@constraints(model, args...)
Adds groups of constraints at once, in the same fashion as the @constraint macro.
The model must be the first argument, and multiple constraints can be added on multiple lines, wrapped in a begin ... end block.
The macro returns a tuple containing the constraints that were defined.
Example
julia> model = Model();
julia> @variable(model, w);
julia> @variable(model, x);
julia> @variable(model, y);
julia> @variable(model, z[1:3]);
julia> @constraints(model, begin
x >= 1
y - w <= 2
sum_to_one[i=1:3], z[i] + y == 1
end);
julia> print(model)
Feasibility
Subject to
sum_to_one[1] : y + z[1] = 1
sum_to_one[2] : y + z[2] = 1
sum_to_one[3] : y + z[3] = 1
x ≥ 1
-w + y ≤ 2
#
JuMP.@expression
— Macro
@expression(model::GenericModel, expression)
@expression(model::GenericModel, [index_sets...], expression)
@expression(model::GenericModel, name, expression)
@expression(model::GenericModel, name[index_sets...], expression)
Efficiently builds and returns an expression.
The name argument is optional. If index sets are passed, a container is built and the expression may depend on the indices of the index sets.
Keyword arguments
-
container = :Auto: forces the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type supported by a JuMP extension.
Example
julia> model = Model();
julia> @variable(model, x[1:5]);
julia> @expression(model, shared, sum(i * x[i] for i in 1:5))
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]
julia> shared
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]
As with the @variable macro, the second argument may define index sets, which can be used when constructing the expressions:
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> @expression(model, expr[i = 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
x[1] + x[2] + x[3]
2 x[1] + 2 x[2] + 2 x[3]
3 x[1] + 3 x[2] + 3 x[3]
Anonymous syntax is also supported:
julia> model = Model();
julia> @variable(model, x[1:3]);
julia> expr = @expression(model, [i in 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
x[1] + x[2] + x[3]
2 x[1] + 2 x[2] + 2 x[3]
3 x[1] + 3 x[2] + 3 x[3]
#
JuMP.@expressions
— Macro
@expressions(model, args...)
Adds multiple expressions to the model at once, in the same fashion as the @expression macro.
The model must be the first argument, and multiple expressions can be added on multiple lines, wrapped in a begin ... end block.
The macro returns a tuple containing the expressions that were defined.
Example
julia> model = Model();
julia> @variable(model, x);
julia> @variable(model, y);
julia> @variable(model, z[1:2]);
julia> a = [4, 5];
julia> @expressions(model, begin
my_expr, x^2 + y^2
my_expr_1[i = 1:2], a[i] - z[i]
end)
(x² + y², AffExpr[-z[1] + 4, -z[2] + 5])
#
JuMP.@force_nonlinear
— Macro
@force_nonlinear(expr)
Changes the way the expression expr is parsed so that it builds a GenericNonlinearExpr instead of a GenericAffExpr or GenericQuadExpr.
This macro works by walking expr and replacing all calls to +, -, *, / and ^ with calls that build a GenericNonlinearExpr.
It throws an error if the resulting expression does not produce a GenericNonlinearExpr, for example, because the expression does not use the basic arithmetic operators.
When to use this macro
In most cases, this macro should not be used.
Use it only if the intended output type is GenericNonlinearExpr and the regular macro calls destroy the problem structure, or in the rare cases where the regular macro calls allocate many intermediate objects, for example, because of promotion to a common quadratic expression.
Example
The first use case: preserving the problem structure.
julia> model = Model();
julia> @variable(model, x);
julia> @expression(model, (x - 0.1)^2)
x² - 0.2 x + 0.010000000000000002
julia> @expression(model, @force_nonlinear((x - 0.1)^2))
(x - 0.1) ^ 2
julia> (x - 0.1)^2
x² - 0.2 x + 0.010000000000000002
julia> @force_nonlinear((x - 0.1)^2)
(x - 0.1) ^ 2
The second use case: reducing memory allocations.
In this example, we know that the nonlinear expression to build is x * 2.0 * (1 + x) * x.
However, the default parsing first creates:
-
the GenericAffExpr a = x * 2.0;
-
another GenericAffExpr b = 1 + x;
-
the GenericQuadExpr c = a * b;
-
and finally the GenericNonlinearExpr *(c, x).
In contrast, the modified parsing creates:
-
the GenericNonlinearExpr a = GenericNonlinearExpr(:+, 1, x);
-
and the GenericNonlinearExpr GenericNonlinearExpr(:*, x, 2.0, a, x).
As a result, much less memory is allocated.
julia> model = Model();
julia> @variable(model, x);
julia> @expression(model, x * 2.0 * (1 + x) * x)
(2 x² + 2 x) * x
julia> @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
x * 2.0 * (1 + x) * x
julia> @allocated @expression(model, x * 2.0 * (1 + x) * x)
3200
julia> @allocated @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
640
#
JuMP.@objective
— Macro
@objective(model::GenericModel, sense, func)
Sets the objective sense to sense and the objective function to func.
The sense can be Min, Max, MOI.MIN_SENSE, MOI.MAX_SENSE or MOI.FEASIBILITY_SENSE. To set the sense programmatically, that is, when sense is a variable whose value is the sense, you must use one of the three MOI.OptimizationSense values.
Example
Minimizing the value of the variable x
:
julia> model = Model();
julia> @variable(model, x)
x
julia> @objective(model, Min, x)
x
Maximizing the value of the affine expression 2x - 1
:
julia> model = Model();
julia> @variable(model, x)
x
julia> @objective(model, Max, 2x - 1)
2 x - 1
Setting the sense programmatically:
julia> model = Model();
julia> @variable(model, x)
x
julia> sense = MIN_SENSE
MIN_SENSE::OptimizationSense = 0
julia> @objective(model, sense, x^2 - 2x + 1)
x² - 2 x + 1
#
JuMP.@operator
— Macro
@operator(model, operator, dim, f[, ∇f[, ∇²f]])
Adds a non-linear operator operator
to the model model
with dim
arguments and creates a new object NonlinearOperator
with the name operator
in the current scope.
The function f
evaluates the operator and must return a scalar value.
The optional function ∇f
calculates the first derivative, and the optional function ∇²f
calculates the second derivative.
The function ∇²f
can only be passed if the function ∇f
is passed.
One-dimensional syntax
With dim == 1
, the method signatures of each function should be as follows:
-
f(::T)::T where {T<:Real}
-
∇f(::T)::T where {T<:Real}
-
∇²f(::T)::T where {T<:Real}
Multidimensional syntax
For dim > 1
, the method signatures of each function should be as follows:
-
f(x::T...)::T where {T<:Real}
-
∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}
-
∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}
where the gradient vector g and the Hessian matrix H
are filled in place. For the Hessian, only the nonzero elements of the lower triangle need to be filled in; setting an off-diagonal element of the upper triangle may result in an error. A multivariate sketch is given after the example below.
Example
julia> model = Model();
julia> @variable(model, x)
x
julia> f(x::Float64) = x^2
f (generic function with 1 method)
julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)
julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)
julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)
julia> @objective(model, Min, op_f(x))
op_f(x)
julia> op_f(2.0)
4.0
julia> model[:op_f]
NonlinearOperator(f, :op_f)
julia> model[:op_f](x)
op_f(x)
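The example above uses the one-dimensional syntax. As a hedged sketch of the multidimensional syntax described earlier (the names g, ∇g, ∇²g and op_g are illustrative assumptions, not part of JuMP):
using JuMP

# A two-argument operator: f(x...)::T, with gradient and Hessian filled in place.
g(x, y) = x^2 + y^2

function ∇g(grad::AbstractVector, x, y)
    # fill the gradient in place
    grad[1] = 2 * x
    grad[2] = 2 * y
    return
end

function ∇²g(H::AbstractMatrix, x, y)
    # only the lower triangle needs to be filled
    H[1, 1] = 2.0
    H[2, 2] = 2.0
    return
end

model = Model()
@variable(model, x[1:2])
@operator(model, op_g, 2, g, ∇g, ∇²g)
@objective(model, Min, op_g(x[1], x[2]))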
Version without macro
This macro provides convenient syntax consistent with the style of the rest of the JuMP macros. However, operators can also be added without the macro by calling the method add_nonlinear_operator
. For example:
julia> model = Model();
julia> f(x) = x^2
f (generic function with 1 method)
julia> @operator(model, op_f, 1, f)
NonlinearOperator(f, :op_f)
equivalent to
julia> model = Model();
julia> f(x) = x^2
f (generic function with 1 method)
julia> op_f = model[:op_f] = add_nonlinear_operator(model, 1, f; name = :op_f)
NonlinearOperator(f, :op_f)
#
JuMP.@variable
— Macro
@variable(model, expr, args..., kw_args...)
Adds a variable to the model described by the expression expr
, positional arguments args
and named arguments `kw_args'.
Anonymous and named variables
The expression expr
must have one of the following forms:
- omitted (for example, @variable(model)); this creates an anonymous variable;
- a single symbol, for example @variable(model, x);
- a container expression, for example @variable(model, x[i=1:3]);
- an anonymous container expression, for example @variable(model, [i=1:3]).
Bounds
In addition, the expression may have bounds, for example:
- @variable(model, x >= 0)
- @variable(model, x <= 0)
- @variable(model, x == 0)
- @variable(model, 0 <= x <= 1)
The bounds may depend on the indices of the container expression:
- @variable(model, -i <= x[i=1:3] <= i)
Sets
You can explicitly specify the set to which the variable belongs:
- @variable(model, x in MOI.Interval(0.0, 1.0))
For more information about this syntax, see the section Variables constrained on creation.
Positional arguments
The following positional arguments are recognized in args:
- Bin: restricts the variable to the set MOI.ZeroOne, that is, {0, 1}. For example, @variable(model, x, Bin). Note that calling @variable(model, Bin) is invalid; use the named argument binary instead.
- Int: restricts the variable to the set of integers, that is, …, -2, -1, 0, 1, 2, … For example, @variable(model, x, Int). Note that calling @variable(model, Int) is invalid; use the named argument integer instead.
- Symmetric: available only when creating a square matrix of variables, that is, when expr has the form varname[1:n,1:n] or varname[i=1:n,j=1:n]; a symmetric matrix of variables is created.
- PSD: a restrictive extension of Symmetric, which constrains the square matrix of variables to be Symmetric and adds a constraint that the matrix is positive semidefinite.
Named arguments
Four named arguments are useful in all cases:
- base_name: specifies the name prefix used to generate variable names. For scalar variables, it corresponds to the variable name; otherwise, the variable names are base_name[…] for each index … of the axes.
- start::Float64: specifies the value passed to set_start_value for each variable.
- container: sets the container type. For more information, see Forced setting of the container type.
- set_string_name::Bool = true: determines whether to set the MOI.VariableName attribute. Passing set_string_name = false can improve performance.
Other named arguments are useful for disambiguating anonymous variables:
- lower_bound::Float64: an alternative to x >= lb; sets the value of the lower bound of the variable.
- upper_bound::Float64: an alternative to x <= ub; sets the value of the upper bound of the variable.
- binary::Bool: an alternative to passing Bin; indicates whether the variable is binary.
- integer::Bool: an alternative to passing Int; indicates whether the variable is an integer.
- set::MOI.AbstractSet: an alternative to using x in set.
- variable_type: used by JuMP extensions. For more information, see Macro extension @variable.
Example
Below are equivalent ways to create a variable x
with the name x
and a lower bound of 0:
julia> model = Model();
julia> @variable(model, x >= 0)
x
julia> model = Model();
julia> @variable(model, x, lower_bound = 0)
x
julia> model = Model();
julia> x = @variable(model, base_name = "x", lower_bound = 0)
x
Other examples:
julia> model = Model();
julia> @variable(model, x[i=1:3] <= i, Int, start = sqrt(i), lower_bound = -i)
3-element Vector{VariableRef}:
x[1]
x[2]
x[3]
julia> @variable(model, y[i=1:3], container = DenseAxisArray, set = MOI.ZeroOne())
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, Base.OneTo(3)
And data, a 3-element Vector{VariableRef}:
y[1]
y[2]
y[3]
julia> @variable(model, z[i=1:3], set_string_name = false)
3-element Vector{VariableRef}:
_[7]
_[8]
_[9]
#
JuMP.@variables
— Macro
@variables(model, args...)
Adds multiple variables to the model at once, in the same way as the @variable
macro.
The model must be the first argument, and multiple variables can be added on separate lines enclosed in a begin … end
block.
The macro returns a tuple containing the variables that were defined.
Example
julia> model = Model();
julia> @variables(model, begin
x
y[i = 1:2] >= 0, (start = i)
z, Bin, (start = 0, base_name = "Z")
end)
(x, VariableRef[y[1], y[2]], Z)
Named arguments must be enclosed in parentheses (see the example above).
#
JuMP.TerminationStatusCode
— Type
TerminationStatusCode
Enumeration of possible values of the TerminationStatus
attribute. This attribute describes the reason why the optimizer stopped executing on the last call to optimize!
.
Values
Possible values:
-
OPTIMIZE_NOT_CALLED
: The algorithm has not started. -
OPTIMAL
: The algorithm found a globally optimal solution. -
INFEASIBLE
: The algorithm concluded that no feasible solution exists. -
DUAL_INFEASIBLE
: The algorithm concluded that no dual bound exists for the problem. If, additionally, a feasible (primal) solution is known to exist, this status typically implies that the problem is unbounded, with some technical exceptions. -
LOCALLY_SOLVED
: The algorithm converged to a stationary point, local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees. -
LOCALLY_INFEASIBLE
: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, without guarantees that no feasible solution exists. -
INFEASIBLE_OR_UNBOUNDED
: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve. -
ALMOST_OPTIMAL
: The algorithm found a globally optimal solution to relaxed tolerances. -
ALMOST_INFEASIBLE
: The algorithm concluded that no feasible solution exists within relaxed tolerances. -
ALMOST_DUAL_INFEASIBLE
: The algorithm concluded that no dual bound exists for the problem within relaxed tolerances. -
ALMOST_LOCALLY_SOLVED
: The algorithm converged to a stationary point, local optimal solution, or could not find directions for improvement within relaxed tolerances. -
ITERATION_LIMIT
: An iterative algorithm stopped after conducting the maximum number of iterations. -
TIME_LIMIT
: The algorithm stopped after a user-specified computation time. -
NODE_LIMIT
: A branch-and-bound algorithm stopped because it explored a maximum number of nodes in the branch-and-bound tree. -
SOLUTION_LIMIT
: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters. -
MEMORY_LIMIT
: The algorithm stopped because it ran out of memory. -
OBJECTIVE_LIMIT
: The algorithm stopped because it found a solution better than a minimum limit set by the user. -
NORM_LIMIT
: The algorithm stopped because the norm of an iterate became too large. -
OTHER_LIMIT
: The algorithm stopped due to a limit not covered by one of the LIMIT
statuses above. -
SLOW_PROGRESS
: The algorithm stopped because it was unable to continue making progress towards the solution. -
NUMERICAL_ERROR
: The algorithm stopped because it encountered unrecoverable numerical error. -
INVALID_MODEL
: The algorithm stopped because the model is invalid. -
INVALID_OPTION
: The algorithm stopped because it was provided an invalid option. -
INTERRUPTED
: The algorithm stopped because of an interrupt signal. -
OTHER_ERROR
: The algorithm stopped because of an error not covered by one of the statuses defined above.
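As a hedged illustration (not part of the original docstring), a typical workflow checks this status after a solve; HiGHS below is only an assumed example solver, and any MOI-compatible optimizer works the same way:
using JuMP, HiGHS  # HiGHS is an assumed example solver

model = Model(HiGHS.Optimizer)
set_silent(model)            # suppress solver output
@variable(model, x >= 0)
@objective(model, Min, x)
optimize!(model)

# termination_status returns one of the TerminationStatusCode values listed above
if termination_status(model) == OPTIMAL
    println("solved to optimality")
end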
#
JuMP.ResultStatusCode
— Type
ResultStatusCode
Enumeration of possible values of the PrimalStatus
and DualStatus
attributes.
The values indicate how the resulting vector should be interpreted.
Values
Possible values:
-
NO_SOLUTION
: the result vector is empty. -
FEASIBLE_POINT
: the result vector is a feasible point. -
NEARLY_FEASIBLE_POINT
: the result vector is feasible if some constraint tolerances are relaxed. -
INFEASIBLE_POINT
: the result vector is an infeasible point. -
INFEASIBILITY_CERTIFICATE
: the result vector is an infeasibility certificate. If the PrimalStatus
is INFEASIBILITY_CERTIFICATE
, then the primal result vector is a certificate of dual infeasibility. If the DualStatus
is INFEASIBILITY_CERTIFICATE
, then the dual result vector is a proof of primal infeasibility. -
NEARLY_INFEASIBILITY_CERTIFICATE
: the result satisfies a relaxed criterion for a certificate of infeasibility. -
REDUCTION_CERTIFICATE
: the result vector is an ill-posed certificate; see this article for details. If the PrimalStatus
is REDUCTION_CERTIFICATE
, then the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus
is REDUCTION_CERTIFICATE
, then the dual result vector is a proof that the primal is ill-posed. -
NEARLY_REDUCTION_CERTIFICATE
: the result satisfies a relaxed criterion for an ill-posed certificate. -
UNKNOWN_RESULT_STATUS
: the result vector contains a solution with an unknown interpretation. -
OTHER_RESULT_STATUS
: the result vector contains a solution with an interpretation not covered by one of the statuses defined above
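A minimal sketch (with the same assumption as above that HiGHS is an available example solver) of guarding solution queries on the result status:
using JuMP, HiGHS  # HiGHS is an assumed example solver

model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, 2x)
optimize!(model)

# Query the primal solution only when the result vector is a feasible point.
if primal_status(model) == FEASIBLE_POINT
    println("x = ", value(x))
end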
#
JuMP.OptimizationSense
— Type
OptimizationSense
Enumeration of possible values of the ObjectiveSense attribute.
Values
Possible values:
-
MIN_SENSE
: the goal is to minimize the objective function -
MAX_SENSE
: the goal is to maximize the objective function -
FEASIBILITY_SENSE
: the model does not have an objective function
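A short sketch of setting and then querying the sense programmatically; objective_sense is the standard JuMP accessor for this attribute:
using JuMP

model = Model()
@variable(model, x)
@objective(model, MAX_SENSE, x)  # set the sense programmatically
objective_sense(model)           # returns MAX_SENSE::OptimizationSense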
#
JuMP.VariableRef
— Type
GenericVariableRef{T} <: AbstractVariableRef
Contains a reference to the model and the corresponding MOI.VariableIndex.