## Invited Session Fri.2.H 2051

#### Friday, 13:15 - 14:45 h, Room: H 2051

**Cluster 24: Variational analysis** [...]

### Duality in convex optimization

**Chair: Radu Ioan Bot**

**Friday, 13:15 - 13:40 h, Room: H 2051, Talk 1**

**Ernö Robert Csetnek**

Conjugate duality and the control of linear discrete systems

**Coauthor: Radu Ioan Bot**

**Abstract:**

We consider a constrained minimization problem in which the function to be minimized is convex with values in the extended real line and the set of constraints is governed by a set-valued operator with convex graph. We attach a dual problem to it, deliver regularity conditions guaranteeing the equality of the optimal objective values of the two problems, and also discuss the existence of optimal solutions. The results are applied to the control of linear discrete systems.
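The setting described above can be sketched as follows, in generic notation; the spaces, the operator \(S\), and the particular Lagrangian-type dual are assumptions for illustration, not necessarily the talk's exact construction:

```latex
% Primal problem: f: X -> R ∪ {+∞} convex, S: X ⇉ Y set-valued with
% convex graph, feasibility meaning 0 ∈ S(x)
\[
(P)\qquad \inf_{x \in X} \{\, f(x) : 0 \in S(x) \,\}
\]
% A conjugate (Lagrangian-type) dual, where the pairing with the graph
% of S replaces the classical constraint term
\[
(D)\qquad \sup_{y^* \in Y^*} \;\inf_{(x,y) \in \operatorname{gr} S} \{\, f(x) + \langle y^*, y \rangle \,\}
\]
% Weak duality v(D) ≤ v(P) always holds (take y = 0 at any feasible x);
% regularity conditions of the kind the talk delivers close the gap.
```

The regularity conditions mentioned in the abstract are what guarantee \(v(P) = v(D)\) together with dual attainment.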

**Friday, 13:45 - 14:10 h, Room: H 2051, Talk 2**

**André Heinrich**

The support vector machines approach via Fenchel-type duality

**Coauthors: Radu I. Bot, Gert Wanka**

**Abstract:**

Supervised learning methods are powerful techniques for learning a function from a given set of labeled data, the so-called training data. In this talk the support vector machines approach to classification and regression is investigated from a theoretical point of view that makes use of convex analysis and Fenchel duality. Starting with the corresponding Tikhonov regularization problem, reformulated as a convex optimization problem, we introduce a conjugate dual problem to it and prove that, whenever strong duality holds, the function to be learned can be expressed via the optimal solutions of the dual problem. Corresponding dual problems are then derived for different loss functions, both for the classification task and for the regression task. The theoretical results are applied by numerically solving an image classification task originating from a quality control problem faced by a supplier to the automotive industry. The accuracy of the resulting classifiers demonstrates the excellent performance of support vector classification on this high-dimensional real-world data.
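A minimal sketch of the primal-dual pair behind this approach, in standard kernel-machine notation (the loss \(v\), the RKHS \(\mathcal{H}_k\), and the exact dual form are generic assumptions, not necessarily the talk's formulation):

```latex
% Tikhonov-regularized learning problem: training data (x_i, y_i),
% i = 1,...,n, loss v, regularization trade-off C > 0
\[
(P)\qquad \inf_{f \in \mathcal{H}_k} \Big\{\, C \sum_{i=1}^{n} v\big(f(x_i), y_i\big) + \tfrac{1}{2}\|f\|_{\mathcal{H}_k}^2 \,\Big\}
\]
% A Fenchel-type dual, written via the conjugate loss v*(·, y_i) and the
% Gram matrix K = (k(x_i, x_j))_{i,j}
\[
(D)\qquad \sup_{\alpha \in \mathbb{R}^n} \Big\{\, -C \sum_{i=1}^{n} v^{*}\big(-\tfrac{\alpha_i}{C},\, y_i\big) - \tfrac{1}{2}\, \alpha^{\top} K \alpha \,\Big\}
\]
% Under strong duality the learned function is recovered from a dual
% optimal solution \bar{\alpha}:  f(x) = \sum_{i=1}^{n} \bar{\alpha}_i k(x_i, x).
```

Choosing different losses \(v\) (hinge, \(\varepsilon\)-insensitive, squared) yields the different classification and regression duals the abstract mentions.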

**Friday, 14:15 - 14:40 h, Room: H 2051, Talk 3**

**Sorin-Mihai Grad**

Classical linear vector optimization duality revisited

**Coauthors: Radu Ioan Bot, Gert Wanka**

**Abstract:**

We introduce a vector dual problem that successfully *cures* the *trouble* encountered by some classical vector duals to the classical linear vector optimization problem in finite-dimensional spaces. This new-old vector dual is based on a vector dual introduced by Boţ and Wanka for the case when the image space of the objective function of the primal problem is partially ordered by the corresponding nonnegative orthant, extending it to the framework where an arbitrary nontrivial pointed convex cone partially orders the mentioned space. Unlike other recent contributions to the field, which are of a set-valued nature, the vector dual problem we propose has a vector objective function. Weak, strong and converse duality for this vector dual problem are delivered, and it is compared with other vector duals considered in the same framework in the literature. We also extend a well-known classical result by showing that the efficient solutions of the classical linear vector optimization problem coincide with its properly efficient solutions (in any sense) when the image space is partially ordered by a nontrivial pointed closed convex cone.
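For orientation, the primal problem and efficiency notion involved can be sketched in standard notation (the matrices and the cone are generic assumptions; the talk's vector dual itself is not reproduced here):

```latex
% Classical linear vector optimization problem: L ∈ R^{k×n}, A ∈ R^{m×n},
% b ∈ R^m, and K ⊆ R^k a nontrivial pointed convex cone ordering R^k
\[
(VP)\qquad \operatorname{Min}_{K}\ \{\, Lx : Ax = b,\ x \geq 0 \,\}
\]
% A feasible \bar{x} is efficient if there is no feasible x with
% L\bar{x} - Lx ∈ K \setminus \{0\}.  The final result of the abstract
% states that, when K is additionally closed, the efficient solutions
% of (VP) coincide with its properly efficient solutions (in any sense).
```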