Use of LinearProgramming
LinearOptimization can accept the objective and constraints in this form:
LinearOptimization[t, {2 x + y + 2 < t, -x - 2 y + 3 < t,
-3 x + y < t, 2 x - 3 y < t}, {t, x, y}]
(* {t -> 29/15, x -> -(2/5), y -> 11/15} *)
N[%]
(* {t -> 1.93333, x -> -0.4, y -> 0.733333} *)
It can also convert them to the matrix formulation; for example:
obj = LinearOptimization[t, {2 x + y + 2 < t, -x - 2 y + 3 < t,
-3 x + y < t, 2 x - 3 y < t}, {t, x, y}, "ObjectiveVector"]
(* {1, 0, 0} *)
{a, b} = Normal @ LinearOptimization[t, {2 x + y + 2 < t, -x - 2 y + 3 < t,
-3 x + y < t, 2 x - 3 y < t}, {t, x, y}, "LinearInequalityConstraints"]
(* {{{1, -2, -1}, {1, 1, 2}, {1, 3, -1}, {1, -2, 3}}, {-2, -3, 0, 0}} *)
Of course, it is possible to feed these back into LinearOptimization:
LinearOptimization[obj, {a, b}]
(* {29/15, -(2/5), 11/15} *)
or into LinearProgramming:
LinearProgramming[obj, a, -b, None]
(* {29/15, -(2/5), 11/15} *)
but in general LinearOptimization is the more modern and flexible function.
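As one small illustration of that flexibility, LinearOptimization can return named solution properties directly, such as the minimum value of the objective:

    LinearOptimization[t, {2 x + y + 2 < t, -x - 2 y + 3 < t,
      -3 x + y < t, 2 x - 3 y < t}, {t, x, y}, "PrimalMinimumValue"]
    (* 29/15 *)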
Taking your variables in the order {t, x, y}, you can rewrite your constraints as follows, which makes it easier (at least for me) to turn them into matrix form (the coefficient vector for each constraint is shown as a comment on its line):
t - 2 x - y > 2 (* {1, -2, -1} *)
t + x + 2 y > 3 (* {1, 1, 2} *)
t + 3 x - y > 0 (* {1, 3, -1} *)
t - 2 x + 3 y > 0 (* {1, -2, 3} *)
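Rather than reading the coefficients off by hand, one option (a sketch, assuming each constraint is first rearranged into the form expr > 0) is to let CoefficientArrays extract the constant and linear parts for you:

    exprs = {t - 2 x - y - 2, t + x + 2 y - 3, t + 3 x - y, t - 2 x + 3 y};
    {b, a} = Normal @ CoefficientArrays[exprs, {t, x, y}]
    (* {{-2, -3, 0, 0}, {{1, -2, -1}, {1, 1, 2}, {1, 3, -1}, {1, -2, 3}}} *)

CoefficientArrays returns the constant terms first and the linear coefficient matrix second, matching the "LinearInequalityConstraints" output above.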
Then your target function $t$ is represented by the coefficient vector {1, 0, 0} with respect to the variables {t, x, y}.
With this, you can write:
LinearProgramming[
{1, 0, 0},
{{1, -2, -1},
{1, 1, 2},
{1, 3, -1},
{1, -2, 3}},
{2, 3, 0, 0},
None
]
(* Out: {29/15, -(2/5), 11/15} *)
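As a quick sanity check (using the matrix form derived above), the solution can be substituted back to confirm that every inequality holds:

    x0 = {29/15, -(2/5), 11/15};
    a = {{1, -2, -1}, {1, 1, 2}, {1, 3, -1}, {1, -2, 3}};
    b = {2, 3, 0, 0};
    And @@ Thread[a . x0 >= b]
    (* True *)

The first three constraints are active (hold with equality), which is what pins down the optimum of this minimax problem.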
The result above is exact, because exact coefficients were provided, but of course it is numerically the same as the one reported by FindMinimum.