Getting started with JuMP
This tutorial was generated using Literate.jl. Download the source as a .jl file.
This tutorial is aimed at providing a quick introduction to writing and solving optimization models with JuMP.
If you’re new to Julia, start by reading Getting started with Julia.
What is JuMP?
JuMP ("Julia for Mathematical Programming") is an open-source modeling language that is embedded in Julia. It allows users to formulate various classes of optimization problems (linear, mixed-integer, quadratic, conic quadratic, semidefinite, and nonlinear) with easy-to-read code. These problems can then be solved using state-of-the-art open-source and commercial solvers.
JuMP also makes advanced optimization techniques easily accessible from a high-level language.
What is a solver?
A solver is a software package that incorporates algorithms for finding solutions to one or more classes of problems.
For example, HiGHS is a solver for linear programming (LP) and mixed integer programming (MIP) problems. It incorporates algorithms such as the simplex method and the interior-point method.
The Supported-solvers table lists the open-source and commercial solvers that JuMP currently supports.
What is MathOptInterface?
Each solver has its own concepts and data structures for representing optimization models and obtaining results.
MathOptInterface (MOI) is an abstraction layer that JuMP uses to convert from the problem written in JuMP to the solver-specific data structures for each solver.
MOI can be used directly, or through a higher-level modeling interface like JuMP.
Because JuMP is built on top of MOI, you’ll often see the MathOptInterface. prefix displayed when JuMP types are printed. However, you’ll only need to understand and interact with MOI to accomplish advanced tasks such as creating solver-independent callbacks.
Installation
JuMP is a package for Julia. From Julia, JuMP is installed by using the built-in package manager.
import Pkg
Pkg.add("JuMP")
You also need to include a Julia package which provides an appropriate solver. One such solver is HiGHS.Optimizer, which is provided by the HiGHS.jl package.
import Pkg
Pkg.add("HiGHS")
See Installation Guide for a list of other solvers you can use.
An example
Let’s solve the following linear programming problem using JuMP and HiGHS. We will first look at the complete code to solve the problem and then go through it step by step.
Here’s the problem:

$$
\begin{aligned}
\min \quad & 12x + 20y \\
\text{s.t.} \quad & 6x + 8y \ge 100 \\
& 7x + 12y \ge 120 \\
& x \ge 0 \\
& 0 \le y \le 3.
\end{aligned}
$$
And here’s the code to solve this problem:
julia> using JuMP
julia> using HiGHS
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> @variable(model, x >= 0)
x
julia> @variable(model, 0 <= y <= 3)
y
julia> @objective(model, Min, 12x + 20y)
12 x + 20 y
julia> @constraint(model, c1, 6x + 8y >= 100)
c1 : 6 x + 8 y ≥ 100
julia> @constraint(model, c2, 7x + 12y >= 120)
c2 : 7 x + 12 y ≥ 120
julia> print(model)
Min 12 x + 20 y
Subject to
c1 : 6 x + 8 y ≥ 100
c2 : 7 x + 12 y ≥ 120
x ≥ 0
y ≥ 0
y ≤ 3
julia> optimize!(model)
Running HiGHS 1.7.2 (git hash: 5ce7a2753): Copyright (c) 2024 HiGHS under MIT licence terms
Coefficient ranges:
Matrix [6e+00, 1e+01]
Cost [1e+01, 2e+01]
Bound [3e+00, 3e+00]
RHS [1e+02, 1e+02]
Presolving model
2 rows, 2 cols, 4 nonzeros 0s
2 rows, 2 cols, 4 nonzeros 0s
Presolve : Reductions: rows 2(-0); columns 2(-0); elements 4(-0) - Not reduced
Problem not reduced by presolve: solving the LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2(220) 0s
2 2.0500000000e+02 Pr: 0(0) 0s
Model status : Optimal
Simplex iterations: 2
Objective value : 2.0500000000e+02
HiGHS run time : 0.00
julia> termination_status(model)
OPTIMAL::TerminationStatusCode = 1
julia> primal_status(model)
FEASIBLE_POINT::ResultStatusCode = 1
julia> dual_status(model)
FEASIBLE_POINT::ResultStatusCode = 1
julia> objective_value(model)
204.99999999999997
julia> value(x)
15.000000000000005
julia> value(y)
1.249999999999996
julia> shadow_price(c1)
-0.24999999999999922
julia> shadow_price(c2)
-1.5000000000000007
Step-by-step
Once JuMP is installed, to use JuMP in your programs, write:
julia> using JuMP
We also need to include a Julia package which provides an appropriate solver. We want to use HiGHS.Optimizer here, which is provided by the HiGHS.jl package:
julia> using HiGHS
JuMP builds problems incrementally in a Model object. Create a model by passing an optimizer to the Model function:
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
Variables are modeled using @variable:
julia> @variable(model, x >= 0)
x
The macro creates a new Julia object, x, in the current scope. We could have made this explicit by writing x = @variable(model, x >= 0), but the macro does this automatically for us to save writing x =.
Variables can have lower and upper bounds:
julia> @variable(model, 0 <= y <= 30)
y
The objective is set using @objective:
julia> @objective(model, Min, 12x + 20y)
12 x + 20 y
Constraints are modeled using @constraint. Here, c1 and c2 are the names of our constraints:
julia> @constraint(model, c1, 6x + 8y >= 100)
c1 : 6 x + 8 y ≥ 100
julia> @constraint(model, c2, 7x + 12y >= 120)
c2 : 7 x + 12 y ≥ 120
Call print to display the model:
julia> print(model)
Min 12 x + 20 y
Subject to
c1 : 6 x + 8 y ≥ 100
c2 : 7 x + 12 y ≥ 120
x ≥ 0
y ≥ 0
y ≤ 30
To solve the optimization problem, call the optimize! function:
julia> optimize!(model)
Running HiGHS 1.7.2 (git hash: 5ce7a2753): Copyright (c) 2024 HiGHS under MIT licence terms
Coefficient ranges:
Matrix [6e+00, 1e+01]
Cost [1e+01, 2e+01]
Bound [3e+01, 3e+01]
RHS [1e+02, 1e+02]
Presolving model
2 rows, 2 cols, 4 nonzeros 0s
2 rows, 2 cols, 4 nonzeros 0s
Presolve : Reductions: rows 2(-0); columns 2(-0); elements 4(-0) - Not reduced
Problem not reduced by presolve: solving the LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2(220) 0s
2 2.0500000000e+02 Pr: 0(0) 0s
Model status : Optimal
Simplex iterations: 2
Objective value : 2.0500000000e+02
HiGHS run time : 0.00
The ! after optimize is part of the function’s name. Julia has a convention that functions which modify their arguments should end in !; a common example is push!.
Now let’s see what information we can query about the solution, starting with is_solved_and_feasible:
julia> is_solved_and_feasible(model)
true
We can get more information about the solution by querying the three types of statuses.
termination_status tells us why the solver stopped:
julia> termination_status(model)
OPTIMAL::TerminationStatusCode = 1
In this case, the solver found an optimal solution.
Check primal_status to see if the solver found a primal feasible point:
julia> primal_status(model)
FEASIBLE_POINT::ResultStatusCode = 1
and dual_status to see if the solver found a dual feasible point:
julia> dual_status(model)
FEASIBLE_POINT::ResultStatusCode = 1
Now we know that our solver found an optimal solution, and that it has a primal and a dual solution to query.
Query the objective value using objective_value:
julia> objective_value(model)
205.0
the primal solution using value:
julia> value(x)
15.000000000000004
julia> value(y)
1.2499999999999976
and the dual solution using shadow_price:
julia> shadow_price(c1)
-0.24999999999999917
julia> shadow_price(c2)
-1.5000000000000007
You should always check whether the solver found a solution before calling solution functions like objective_value or value:
```julia
optimize!(model)
if !is_solved_and_feasible(model)
    error("Solver did not find an optimal solution")
end
```
That’s it for our simple model. In the rest of this tutorial, we expand on some of the basic JuMP operations.
Model basics
Create a model by passing an optimizer:
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
Alternatively, call set_optimizer at any point before calling optimize!:
julia> model = Model()
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> set_optimizer(model, HiGHS.Optimizer)
For some solvers, you can also use direct_model, which offers a more efficient connection to the underlying solver:
julia> model = direct_model(HiGHS.Optimizer())
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
Some solvers do not support direct_model.
Solver options
Pass options to solvers with optimizer_with_attributes:
julia> model =
Model(optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => false))
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
These options are solver-specific. To find out the various options available, see the GitHub README of the individual solver packages. The link to each solver’s GitHub page is in the Supported solvers table.
You can also pass options with set_attribute:
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> set_attribute(model, "output_flag", false)
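Options set this way can also be read back. As a quick check, here is a minimal sketch using get_attribute on the same model:

```julia
# Read a solver option back; returns false after the set_attribute call above.
get_attribute(model, "output_flag")
```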
Solution basics
We saw above how to use termination_status and primal_status to understand the solution returned by the solver.
However, only query solution attributes like value and objective_value if there is an available solution. Here’s a recommended way to check:
julia> function solve_infeasible()
model = Model(HiGHS.Optimizer)
@variable(model, 0 <= x <= 1)
@variable(model, 0 <= y <= 1)
@constraint(model, x + y >= 3)
@objective(model, Max, x + 2y)
optimize!(model)
if !is_solved_and_feasible(model)
@warn("The model was not solved correctly.")
return
end
return value(x), value(y)
end
solve_infeasible (generic function with 1 method)
julia> solve_infeasible()
Running HiGHS 1.7.2 (git hash: 5ce7a2753): Copyright (c) 2024 HiGHS under MIT licence terms
Coefficient ranges:
Matrix [1e+00, 1e+00]
Cost [1e+00, 2e+00]
Bound [1e+00, 1e+00]
RHS [3e+00, 3e+00]
Presolving model
Problem status detected on presolve: Infeasible
Model status : Infeasible
Objective value : 0.0000000000e+00
HiGHS run time : 0.00
ERROR: No LP invertible representation for getDualRay
┌ Warning: The model was not solved correctly.
└ @ Main REPL[1]:9
Variable basics
Let’s create a new empty model to explain some of the variable syntax:
julia> model = Model()
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
Variable bounds
All of the variables we have created so far have had a bound. We can also create a free variable:
julia> @variable(model, free_x)
free_x
When creating a variable, instead of using the <= and >= syntax, we can also use the lower_bound and upper_bound keyword arguments:
julia> @variable(model, keyword_x, lower_bound = 1, upper_bound = 2)
keyword_x
We can query whether a variable has a bound using the has_lower_bound and has_upper_bound functions. The values of the bounds can be obtained using the lower_bound and upper_bound functions:
julia> has_upper_bound(keyword_x)
true
julia> upper_bound(keyword_x)
2.0
Note that querying the value of a bound that does not exist will result in an error:
julia> lower_bound(free_x)
ERROR: Variable free_x does not have a lower bound.
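Bounds can also be added, changed, and removed after a variable is created. Here is a minimal sketch using JuMP’s bound-modification functions on the free_x variable from above:

```julia
set_lower_bound(free_x, 0.0)  # add a lower bound to the free variable
lower_bound(free_x)           # returns 0.0
delete_lower_bound(free_x)    # remove the bound again
has_lower_bound(free_x)       # returns false
```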
Containers
We have already seen how to add a single variable to a model using the @variable macro. Now let’s look at ways to add multiple variables to a model.
JuMP provides data structures for adding collections of variables to a model. These data structures are referred to as containers and are of three types: Arrays, DenseAxisArrays, and SparseAxisArrays.
Arrays
JuMP arrays are created when you have integer indices that start at 1:
julia> @variable(model, a[1:2, 1:2])
2×2 Matrix{VariableRef}:
a[1,1] a[1,2]
a[2,1] a[2,2]
Index elements in a as follows:
julia> a[1, 1]
a[1,1]
julia> a[2, :]
2-element Vector{VariableRef}:
a[2,1]
a[2,2]
Create an $n$-dimensional variable $x$ with bounds $l \le x \le u$ as follows:
julia> n = 10
10
julia> l = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
julia> u = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19];
julia> @variable(model, l[i] <= x[i = 1:n] <= u[i])
10-element Vector{VariableRef}:
x[1]
x[2]
x[3]
x[4]
x[5]
x[6]
x[7]
x[8]
x[9]
x[10]
We can also create variable bounds that depend upon the indices:
julia> @variable(model, y[i = 1:2, j = 1:2] >= 2i + j)
2×2 Matrix{VariableRef}:
y[1,1] y[1,2]
y[2,1] y[2,2]
DenseAxisArrays
DenseAxisArrays are used when the indices are not one-based integer ranges. The syntax is similar, except with an arbitrary vector as an index as opposed to a one-based range:
julia> @variable(model, z[i = 2:3, j = 1:2:3] >= 0)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
Dimension 1, 2:3
Dimension 2, 1:2:3
And data, a 2×2 Matrix{VariableRef}:
z[2,1] z[2,3]
z[3,1] z[3,3]
Indices do not have to be integers. They can be any Julia type:
julia> @variable(model, w[1:5, ["red", "blue"]] <= 1)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
Dimension 1, Base.OneTo(5)
Dimension 2, ["red", "blue"]
And data, a 5×2 Matrix{VariableRef}:
w[1,red] w[1,blue]
w[2,red] w[2,blue]
w[3,red] w[3,blue]
w[4,red] w[4,blue]
w[5,red] w[5,blue]
Index elements in a DenseAxisArray as follows:
julia> z[2, 1]
z[2,1]
julia> w[2:3, ["red", "blue"]]
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
Dimension 1, [2, 3]
Dimension 2, ["red", "blue"]
And data, a 2×2 Matrix{VariableRef}:
w[2,red] w[2,blue]
w[3,red] w[3,blue]
See Forcing the container type for more details.
SparseAxisArrays
SparseAxisArrays are created when the indices do not form a Cartesian product. For example, this applies when indices have a dependence upon previous indices (called triangular indexing):
julia> @variable(model, u[i = 1:2, j = i:3])
SparseAxisArray{VariableRef, 2, Tuple{Int64, Int64}} with 5 entries:
[1, 1] = u[1,1]
[1, 2] = u[1,2]
[1, 3] = u[1,3]
[2, 2] = u[2,2]
[2, 3] = u[2,3]
We can also conditionally create variables by adding a comparison check that depends upon the named indices and is separated from the indices by a semicolon (;):
julia> @variable(model, v[i = 1:9; mod(i, 3) == 0])
SparseAxisArray{VariableRef, 1, Tuple{Int64}} with 3 entries:
[3] = v[3]
[6] = v[6]
[9] = v[9]
Index elements in a SparseAxisArray as follows:
julia> u[1, 2]
u[1,2]
julia> v[[3, 6]]
SparseAxisArray{VariableRef, 1, Tuple{Int64}} with 2 entries:
[3] = v[3]
[6] = v[6]
Integrality
JuMP can create binary and integer variables. Binary variables are constrained to the set $\{0, 1\}$, and integer variables are constrained to the set $\mathbb{Z}$.
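For example, pass Bin or Int when creating a variable. Here is a minimal sketch (the names x_bin and x_int are illustrative):

```julia
@variable(model, x_bin, Bin)  # binary: x_bin ∈ {0, 1}
@variable(model, x_int, Int)  # integer: x_int ∈ ℤ
```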
Constraint basics
We’ll need a new model to explain some of the constraint basics:
julia> model = Model()
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> @variable(model, x)
x
julia> @variable(model, y)
y
julia> @variable(model, z[1:10]);
Containers
Just as we had containers for variables, JuMP also provides Arrays, DenseAxisArrays, and SparseAxisArrays for storing collections of constraints. Examples for each container type are given below.
Arrays
Create an Array of constraints:
julia> @constraint(model, [i = 1:3], i * x <= i + 1)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
x ≤ 2
2 x ≤ 3
3 x ≤ 4
DenseAxisArrays
Create a DenseAxisArray of constraints:
julia> @constraint(model, [i = 1:2, j = 2:3], i * x <= j + 1)
2-dimensional DenseAxisArray{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape},2,...} with index sets:
Dimension 1, Base.OneTo(2)
Dimension 2, 2:3
And data, a 2×2 Matrix{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
x ≤ 3 x ≤ 4
2 x ≤ 3 2 x ≤ 4
SparseAxisArrays
Create a SparseAxisArray of constraints:
julia> @constraint(model, [i = 1:2, j = 1:2; i != j], i * x <= j + 1)
SparseAxisArray{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}, 2, Tuple{Int64, Int64}} with 2 entries:
[1, 2] = x ≤ 3
[2, 1] = 2 x ≤ 2
Constraints in a loop
We can add constraints using regular Julia loops:
julia> for i in 1:3
@constraint(model, 6x + 4y >= 5i)
end
or use for-each style syntax inside the @constraint macro:
julia> @constraint(model, [i in 1:3], 6x + 4y >= 5i)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}:
6 x + 4 y ≥ 5
6 x + 4 y ≥ 10
6 x + 4 y ≥ 15
We can also create constraints such as $\sum_{i = 1}^{10} z_i \le 1$:
julia> @constraint(model, sum(z[i] for i in 1:10) <= 1)
z[1] + z[2] + z[3] + z[4] + z[5] + z[6] + z[7] + z[8] + z[9] + z[10] ≤ 1
Objective functions
Set an objective function with @objective:
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> @variable(model, x >= 0)
x
julia> @variable(model, y >= 0)
y
julia> @objective(model, Min, 2x + y)
2 x + y
Create a maximization objective using Max:
julia> @objective(model, Max, 2x + y)
2 x + y
Calling @objective multiple times will overwrite the previous objective. This can be useful when you want to solve the same problem with different objective functions.
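To confirm that the second call replaced the first, here is a minimal sketch using JuMP’s query functions:

```julia
objective_sense(model)     # returns MAX_SENSE after the second @objective call
objective_function(model)  # returns the current objective, 2 x + y
```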
Vectorized syntax
We can also add constraints and an objective to JuMP using vectorized linear algebra. We’ll illustrate this by solving an LP in standard form, that is,

$$
\begin{aligned}
\min \quad & c^\top x \\
\text{s.t.} \quad & A x = b \\
& x \ge 0.
\end{aligned}
$$
julia> vector_model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> A = [1 1 9 5; 3 5 0 8; 2 0 6 13]
3×4 Matrix{Int64}:
1 1 9 5
3 5 0 8
2 0 6 13
julia> b = [7, 3, 5]
3-element Vector{Int64}:
7
3
5
julia> c = [1, 3, 5, 2]
4-element Vector{Int64}:
1
3
5
2
julia> @variable(vector_model, x[1:4] >= 0)
4-element Vector{VariableRef}:
x[1]
x[2]
x[3]
x[4]
julia> @constraint(vector_model, A * x .== b)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
x[1] + x[2] + 9 x[3] + 5 x[4] = 7
3 x[1] + 5 x[2] + 8 x[4] = 3
2 x[1] + 6 x[3] + 13 x[4] = 5
julia> @objective(vector_model, Min, c' * x)
x[1] + 3 x[2] + 5 x[3] + 2 x[4]
julia> optimize!(vector_model)
Running HiGHS 1.7.2 (git hash: 5ce7a2753): Copyright (c) 2024 HiGHS under MIT licence terms
Coefficient ranges:
Matrix [1e+00, 1e+01]
Cost [1e+00, 5e+00]
Bound [0e+00, 0e+00]
RHS [3e+00, 7e+00]
Presolving model
3 rows, 4 cols, 10 nonzeros 0s
3 rows, 4 cols, 10 nonzeros 0s
Presolve : Reductions: rows 3(-0); columns 4(-0); elements 10(-0) - Not reduced
Problem not reduced by presolve: solving the LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 3(13.5) 0s
4 4.9230769231e+00 Pr: 0(0) 0s
Model status : Optimal
Simplex iterations: 4
Objective value : 4.9230769231e+00
HiGHS run time : 0.00
julia> @assert is_solved_and_feasible(vector_model)
julia> objective_value(vector_model)
4.923076923076922
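To query the full primal vector rather than one element at a time, broadcast value over the variable container; a short sketch:

```julia
# Broadcasting returns a Vector{Float64} with one entry per element of x.
value.(x)
```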