## Contributed Session Mon.3.MA 042

#### Monday, 15:15 - 16:45 h, Room: MA 042

**Cluster 11: Integer & mixed-integer programming** [...]

### Linear optimization

**Chair: Angelo Sifaleras**

**Monday, 15:15 - 15:40 h, Room: MA 042, Talk 1**

**Sergei Chubanov**

An improved polynomial relaxation-type algorithm for linear programming

**Abstract:**

To find a solution of a system of linear inequalities, the classical relaxation method projects the current point, at every iteration, onto a hyperplane defined by a violated constraint. The constructed sequence converges to a feasible solution. It is well known that the method is not polynomial. One of the reasons for this is that each iteration considers only one violated constraint among the original constraints of the system. Unlike the relaxation method, each iteration of our algorithm considers an appropriate nonnegative linear combination of the inequalities. The algorithm runs in *O*(*n*^{3}*L*_{min}) time, where *n* is the number of variables and *L*_{min} is the minimum binary size of a feasible solution. In particular, the algorithm either finds a nonnegative solution of a system of linear equations or proves that there are no 0,1-solutions in *O*(*n*^{4}) time. This theoretical estimate is smaller by a factor of *n*^{2} than that of our previous algorithm.
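The classical relaxation step the abstract contrasts against can be sketched as follows. This is a minimal illustration of the standard method (project onto the hyperplane of one violated constraint), not the improved algorithm of the talk; all names and the tolerance are illustrative.

```python
# Sketch of the classical relaxation method for A x <= b (not the
# authors' improved algorithm). When constraint a_i . x <= b_i is
# violated, project x onto its hyperplane:
#   x' = x - ((a_i . x - b_i) / ||a_i||^2) * a_i

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def relaxation_step(x, A, b, tol=1e-12):
    """Project x onto the hyperplane of the first violated constraint;
    return x itself (same object) if A x <= b already holds."""
    for a_i, b_i in zip(A, b):
        slack = dot(a_i, x) - b_i
        if slack > tol:  # constraint a_i . x <= b_i is violated
            scale = slack / dot(a_i, a_i)
            return [xj - scale * aj for xj, aj in zip(x, a_i)]
    return x  # feasible point found

def relaxation_method(x, A, b, max_iters=1000):
    """Iterate projection steps until feasibility or the iteration cap."""
    for _ in range(max_iters):
        nxt = relaxation_step(x, A, b)
        if nxt is x:  # no violated constraint: x is feasible
            return x
        x = nxt
    return x
```

As the abstract notes, working with one violated constraint at a time is one reason this classical scheme is not polynomial; the proposed algorithm instead combines the inequalities with nonnegative weights at each iteration.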

**Monday, 15:45 - 16:10 h, Room: MA 042, Talk 2**

**Roland Wunderling**

The kernel simplex method

**Abstract:**

The simplex method has seen no major computational advances for years, yet it remains the most widely used algorithm for solving LPs; in particular, the dual simplex algorithm is used for MIP because of its warm-start capabilities. State-of-the-art MIP solvers use branch-and-cut algorithms, but the standard dual simplex algorithm only addresses the branching aspect. When cuts are added, a fresh factorization of the basis matrix is usually needed, which greatly reduces true warm-start support. Using a row basis or dualization can mitigate the issue, but this is only efficient for models with more rows than columns.

In this talk we introduce a new simplex algorithm, the kernel simplex method (KSM), which uses a kernel instead of a basis as its central data structure. KSM provides full warm-starting functionality for row and column additions or deletions. We describe the algorithm and contrast its computational properties with those of the traditional simplex method. Further, we show how KSM unifies the primal and dual algorithms into one symmetric algorithm, thus matching duality theory much better than the traditional methods.

**Monday, 16:15 - 16:40 h, Room: MA 042, Talk 3**

**Angelo Sifaleras**

Exterior point simplex-type algorithms for linear and network optimization problems

**Coauthor: Nikolaos Samaras**

**Abstract:**

The linear programming problem is one of the most useful and well-studied optimization problems and is widely applied across many areas of science. Many real-world problems can be formulated as linear programs. The popularity of linear programming can be attributed to many factors, such as the ability to model large problems and to solve them in a reasonable amount of time. Many algorithms have been invented for the solution of a linear program. The majority of these algorithms belong to two main categories: (i) simplex-type, or pivoting, algorithms and (ii) interior-point methods (IPMs). All the algorithms presented in this paper belong to the first category, except one that belongs to both.

The first exterior point simplex-type algorithm (EPSA) was originally developed by Paparrizos for the assignment problem. EPSA constructs two paths to the optimal solution: one path consists of basic but infeasible solutions, while the second path is feasible. The key idea behind EPSA is that taking steps in directions that are linear combinations of attractive descent directions can lead to faster convergence than that achieved by classic simplex-type algorithms.