## Invited Session Mon.3.H 2053

#### Monday, 15:15 - 16:45 h, Room: H 2053

**Cluster 9: Global optimization** [...]

### Algorithms and relaxations for nonconvex optimization problems

**Chair: Jeff Linderoth**

**Monday, 15:15 - 15:40 h, Room: H 2053, Talk 1**

**Bissan Ghaddar**

A global optimization approach for binary polynomial programs

**Coauthor: Juan Vera**

**Abstract:**

In this talk, we present branch-and-dig, an algorithm for finding global solutions of binary polynomial programming problems. We develop inequality-generation techniques based on lift-and-project relaxations for binary polynomial problems; these can speed up the branch-and-bound process by improving the bound at each node, thus reducing the number of nodes in the tree. Computational results are given for small test problems of degree three. In the computational study, we investigate the performance of different branching rules and the impact of the dynamic inequality-generation scheme.
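As a rough illustration of the kind of enumeration the abstract describes (this is a generic branch-and-bound sketch, not the authors' branch-and-dig algorithm, and it uses a naive monomial-wise bound rather than lift-and-project inequalities), consider minimizing a multilinear polynomial over binary variables, where each monomial is stored as a `(coeff, vars)` pair:

```python
# Minimal branch-and-bound sketch (assumed illustration, not branch-and-dig)
# for minimizing a multilinear polynomial over x in {0,1}^n.
# A polynomial is a list of monomials (coeff, vars), e.g. 2*x0*x1 -> (2.0, (0, 1)).

def monomial_bound(coeff, vars_, fixed):
    """Lower bound on coeff * prod(x_i for i in vars_) under a partial assignment."""
    if any(fixed.get(i) == 0 for i in vars_):
        return 0.0                        # a fixed zero makes the monomial vanish
    if all(i in fixed for i in vars_):
        return coeff                      # fully determined: all fixed vars are 1
    return min(0.0, coeff)                # a free variable can still zero it out

def branch_and_bound(poly, n):
    """Return the minimum of the polynomial over {0,1}^n."""
    best = float("inf")
    stack = [{}]                          # stack of partial assignments
    while stack:
        fixed = stack.pop()
        bound = sum(monomial_bound(c, v, fixed) for c, v in poly)
        if bound >= best:
            continue                      # prune: node cannot improve incumbent
        free = [i for i in range(n) if i not in fixed]
        if not free:
            best = bound                  # leaf: the bound is the exact value
            continue
        i = free[0]                       # simplest possible branching rule
        stack.append({**fixed, i: 0})
        stack.append({**fixed, i: 1})
    return best

# Minimize 2*x0*x1*x2 - 3*x0*x1 + x2 over {0,1}^3 (attained at x = (1,1,0)).
poly = [(2.0, (0, 1, 2)), (-3.0, (0, 1)), (1.0, (2,))]
print(branch_and_bound(poly, 3))          # → -3.0
```

The abstract's point is precisely that tighter node bounds (here, replacing `monomial_bound` with bounds from lift-and-project cuts) shrink the tree that this loop explores.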

**Monday, 15:45 - 16:10 h, Room: H 2053, Talk 2**

**Takahito Kuno**

A class of convergent subdivision strategies in the conical algorithm for concave minimization

**Coauthor: Tomohiro Ishihama**

**Abstract:**

We present a new proof of the convergence of the conical algorithm for concave minimization under a pure *ω*-subdivision strategy.

In 1991, Tuy showed that the conical algorithm with *ω*-subdivision is convergent if a certain kind of nondegeneracy holds for the sequences of nested cones generated in the course of the algorithm. Although the convergence has since been proven in other ways, it remains an open question whether those sequences are nondegenerate.

In this talk, we introduce a weaker condition, named *pseudo-nondegeneracy*, and show that the conical algorithm with *ω*-subdivision converges as long as pseudo-nondegeneracy holds for the sequences of nested cones generated by the algorithm. We also show that every sequence generated by the algorithm is pseudo-nondegenerate. Pseudo-nondegeneracy is thus not only a useful condition for proving convergence; it also suggests a class of convergent subdivision strategies.

**Monday, 16:15 - 16:40 h, Room: H 2053, Talk 3**

**Achim Wechsung**

Improving relaxations of implicit functions

**Coauthor: Paul I. Barton**

**Abstract:**

A factorable function **f**: *Y* → **R**^{m}, *Y* ⊂ **R**^{n}, can be represented as a DAG. While it is natural to construct interval extensions of factorable functions, the DAG representation has been shown to also enable the backward propagation of interval bounds on the function's range, i.e., to provide an enclosure of the intersection of *Y* with the function's pre-image. One application is to eliminate points in the domain where no solution of **f**(**y**) = **0** exists. This idea can be extended to the case of constructing convex relaxations of implicit functions. When *n* > *m*, it is possible to partition *Y* into *X* ⊂ **R**^{m} and *P* ⊂ **R**^{n-m}. Assuming that *X* and *P* are intervals and that there exists a unique **x**: *P* → *X* such that **f**(**x**(**p**), **p**) = **0**, it is then possible to construct relaxations of the implicit function **x** using the DAG representation of **f**, backward propagation, and generalized McCormick relaxation theory. These relaxations can be used to initialize other methods that improve relaxations of implicit functions iteratively.
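To make the backward-propagation idea concrete, here is a tiny sketch (an assumed illustration, not the authors' implementation) for the single factorable function *f*(*x*, *p*) = *x*² − *p* with *x* ≥ 0: a forward sweep computes interval extensions of each DAG node, and a backward sweep intersects the range with [0, 0] and pushes the resulting bounds back to the domain, pruning points of *X* where no root can lie:

```python
# Minimal sketch of forward/backward interval propagation on the DAG of
# f(x, p) = x**2 - p, solving f = 0 on the box X x P with X >= 0.
# Intervals are (lo, hi) tuples; this is an illustration, not a library API.

def intersect(a, b):
    """Intersection of two intervals; None if it is empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def prune(X, P):
    """One forward/backward pass; returns the pruned X, or None if no root exists."""
    # Forward sweep: interval extension of each DAG node.
    sq = (X[0] ** 2, X[1] ** 2)           # node s = x**2 (monotone since X >= 0)
    f = (sq[0] - P[1], sq[1] - P[0])      # node f = s - p
    # Backward sweep: intersect the range with the target [0, 0].
    if intersect(f, (0.0, 0.0)) is None:
        return None                       # no solution of f = 0 in X x P
    sq_new = intersect(sq, P)             # from s - p = 0:  s must lie in P
    if sq_new is None:
        return None
    # Invert s = x**2 on the nonnegative branch and intersect with X.
    return intersect(X, (sq_new[0] ** 0.5, sq_new[1] ** 0.5))

print(prune((0.0, 4.0), (1.0, 9.0)))      # → (1.0, 3.0): the domain shrinks
print(prune((0.0, 1.0), (4.0, 9.0)))      # → None: f = 0 has no solution here
```

In the setting of the abstract, the same forward/backward traversal is carried out with generalized McCormick relaxations in place of plain intervals, yielding convex enclosures of the implicit function **x**(**p**) rather than just bound contraction.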