Informs Annual Meeting 2017


INFORMS Houston – 2017

TC74

372D Nonsmooth Optimization
Sponsored: Optimization, Nonlinear Programming
Sponsored Session

Chair: Damek Davis, dsd95@cornell.edu

1 - Generic Acceleration Schema Beyond Convexity
Courtney Paquette, PhD, University of Washington-Seattle, Seattle, WA, United States, cykempton@gmail.com
We introduce a generic "acceleration" scheme that can leverage arbitrary linearly convergent convex optimization algorithms to solve possibly nonconvex optimization problems. The proposed approach simultaneously enjoys the best known complexity bounds, up to log terms, for both convex and nonconvex settings. A class of problems to which the technique does not directly apply consists of minimizing compositions of nonsmooth convex functions and smooth maps. I will describe how a simple Gauss-Newton algorithm, combined with smoothing and fast-gradient subproblem solves, yields a scheme for this class with overall efficiency O(1/ε^3 log(1/ε)). This appears to be the best known complexity bound for this problem class.

2 - Taming Nonconvexity with Data
Zhaoran Wang, PhD, Princeton, NJ, United States, zhaoranwang@gmail.com
In this talk, I will illustrate how statistical thinking enables us to harness the power of nonconvex optimization. Specifically, I will present an algorithmic framework for exploiting the latent geometry induced by the randomness of data. By integrating three new global exploration meta-algorithms (homotopy continuation, tightening after relaxation, and noise regularization) with local search heuristics such as variants of gradient descent, this unified framework leads to new nonconvex optimization algorithms for a broad variety of challenging learning problems.

3 - Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
Damek Davis, Cornell University, 218 Rhodes Hall, Hoy Road, Ithaca, NY, 14850, United States, dsd95@cornell.edu
We introduce a stochastic projected subgradient method for a wide class of nonsmooth, nonconvex functions, which includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochastic projected subgradient method. The primary contribution of this work is a simple proof that the proposed algorithm converges at the same rate as the stochastic gradient method for smooth nonconvex problems. This result validates the use of stochastic subgradient methods in nonsmooth, nonconvex optimization, as is common when optimizing neural networks.

4 - The Radial Subgradient Method
Benjamin Grimmer, Cornell University, Ithaca, NY, United States, bdg79@cornell.edu
We present a subgradient method for minimizing nonsmooth, non-Lipschitz convex optimization problems. We extend the work of Renegar by taking a different perspective, leading to an algorithm that is conceptually more natural, has notably improved convergence rates, and admits a surprisingly simple analysis. At each iteration, the algorithm takes a subgradient step and then performs a line search to move radially towards (or away from) a known feasible point. Our convergence results have striking similarities to those of traditional methods that require Lipschitz continuity. Costly orthogonal projections, typical of subgradient methods, are entirely avoided.

TC75

372E Leveraging Structure in Discrete Optimization
Sponsored: Optimization, Integer and Discrete Optimization
Sponsored Session

Chair: Christian Tjandraatmadja, Carnegie Mellon University, Pittsburgh, PA, 15213, United States, ctjandra@andrew.cmu.edu

1 - On Relaxations of Products of Functions
Taotao He, 1303 Palmer Drive, Apt 212, West Lafayette, IN, 47906, United States, he135@purdue.edu, Mohit Tawarmalani
We develop a new relaxation that exploits function structure while convexifying a product of d functions. The function structure is encapsulated using at most n over- and underestimators, and we convexify the function product in the space of estimators. The separation procedure generates facet-defining inequalities in time polynomial in d. If the functions are non-negative, the concave envelope can be separated in O(d n log n). We extend our construction to infinite families of under- and overestimators. Finally, we interpret the relaxation procedure for non-negative functions as expressing the product as a telescoping sum followed by a simple relaxation operator.

2 - Incorporating Decision Diagrams into MIP Solving
Christian Tjandraatmadja, Carnegie Mellon University, Tepper School of Business, 5000 Forbes Avenue, Pittsburgh, PA, 15213, United States, ctjandra@andrew.cmu.edu, Willem-Jan Van Hoeve
We present techniques that use decision diagrams to enhance mixed-integer programming solvers. Decision diagrams are embedded into the branch-and-bound tree and yield cutting planes and bounds that aid the solving process. We provide computational experiments showing these methods are effective in situations that admit strong decision diagram relaxations.

TC76

372C Optimization in Computational Geometry
Sponsored: Computing
Sponsored Session

Chair: Paul Brooks, Virginia Commonwealth University, Richmond, VA, 23284, United States, jpbrooks@vcu.edu

1 - Snug Simplices
Ghasemali Salmani Jajaei, Virginia Commonwealth University, Richmond, VA, United States, salmanijajaeg@vcu.edu
We propose procedures for enclosing the convex hulls of finite m-dimensional point sets with simplices. A simplex is called snug if it intersects the convex hull in some way. The procedures assess how tightly the generated snug simplices contain the convex hulls by comparing their volumes. In general, calculating the volume of a polyhedron is difficult; however, we are able to find closed-form formulas for certain simplices. Our goal is to find a tight simplex, one whose facets coincide with the facets of the convex hull.

2 - Approximate Convex Hulls
Robert Graham, McGill University, Montreal, QC, Canada, robert.graham2@mail.mcgill.ca
This talk will cover joint work with Adam Oberman on the pixel purity algorithm for computing approximate convex hulls. I will explain how the algorithm computes the curvature of points and prove consistency and convergence. I will also extend the algorithm to compute approximate convex hulls described in terms of hyperplanes.

3 - Linear Optimization Over the Field of Puiseux Fractions
Benjamin Schroter, TU Berlin, Berlin, Germany, schroeter@math.tu-berlin.de
The field of real Puiseux series is a real closed field that extends the real numbers by an infinitesimal element t. This field is again totally ordered; i.e., t is smaller than any positive real number but larger than zero. In particular, Puiseux series allow one to study aspects of algebraic geometry via polyhedral geometry; they are an important link between linear optimization and tropical optimization. We present our implementation of a subfield well suited for exact computations, which we call the field of Puiseux fractions. Our implementation in the software polymake allows us to compute linear programs and convex hulls of polyhedra over the rationals extended with the indeterminate t.

4 - The Recommender Problem with Convex Hulls
Marie-Laure Bougnol, Professor of DSIM, Jacksonville University, mbougno@ju.edu, Jose H. Dula
Consider a collection of users and items they may score using an ordinal scale, and consider a specific user and an item not yet scored by that user. The challenge in this version of the recommender problem is to estimate the user's score for the item. An idea central to this problem is to identify "like-minded peers" and use their scores to make the estimate. We propose procedures based on computational geometry to identify peers and use them to estimate scores.
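For readers unfamiliar with the projected subgradient template that Davis's and Grimmer's abstracts build on, here is a minimal deterministic sketch on a toy problem. The objective, box constraint, and step-size rule are illustrative assumptions, not the proximally guided or radial methods from the talks; a stochastic variant would replace the exact subgradient with a noisy estimate.

```python
import math

def f(x):
    # Nonsmooth convex toy objective: f(x) = |x0| + |x1 - 5|,
    # minimized over the box [-1, 1]^2 at the point (0, 1).
    return abs(x[0]) + abs(x[1] - 5.0)

def subgradient(x):
    # One valid subgradient of f at x (the choice at kinks is arbitrary).
    g0 = 1.0 if x[0] >= 0.0 else -1.0
    g1 = 1.0 if x[1] >= 5.0 else -1.0
    return (g0, g1)

def project_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^2.
    return tuple(min(max(v, lo), hi) for v in x)

def projected_subgradient(x0, iters=10000):
    x = project_box(x0)
    for t in range(iters):
        g = subgradient(x)
        step = 0.5 / math.sqrt(t + 1.0)  # classic diminishing step size
        x = project_box(tuple(x[i] - step * g[i] for i in range(2)))
    return x

x = projected_subgradient((0.9, -0.9))
print(x)  # approaches the constrained minimizer (0, 1)
```

The iterates oscillate around the kink at x0 = 0 with amplitude bounded by the current step size, which is why a diminishing step is essential for nonsmooth problems.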
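The Snug Simplices abstract (Salmani Jajaei and Brooks) mentions closed-form volume formulas for certain simplices. The standard such formula, which their volume comparisons presumably build on, is vol = |det(v1 - v0, ..., vm - v0)| / m!. A small exact-arithmetic sketch (not the authors' code):

```python
from fractions import Fraction
from math import factorial

def det(m):
    # Laplace expansion along the first row; adequate for the small
    # matrices arising from low-dimensional simplices.
    n = len(m)
    if n == 1:
        return m[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def simplex_volume(vertices):
    """Exact volume of an m-simplex in R^m via the closed form
    |det(v1 - v0, ..., vm - v0)| / m!."""
    v0 = vertices[0]
    m = len(v0)
    edges = [[Fraction(v[i]) - Fraction(v0[i]) for i in range(m)]
             for v in vertices[1:]]
    return abs(det(edges)) / factorial(m)

# Unit tetrahedron: volume 1/6
print(simplex_volume([(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]))
```

Using Fraction keeps the volume exact, so comparing a snug simplex's volume against a hull's involves no floating-point tolerance.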
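The total order Schroter describes, in which the infinitesimal t is positive yet smaller than every positive real, can be made concrete: the sign of an expression in t near zero is decided by its lowest-degree nonzero coefficient. The toy below illustrates this for plain polynomials in t; it is not the polymake Puiseux Fractions implementation, which handles quotients and rational exponents.

```python
def sign_at_infinitesimal(coeffs):
    """Sign of sum(coeffs[i] * t**i) for an infinitesimal t > 0.
    The lowest-degree nonzero coefficient dominates all higher-order
    terms, so it alone determines the sign."""
    for c in coeffs:
        if c != 0:
            return 1 if c > 0 else -1
    return 0

print(sign_at_infinitesimal([0, 1]))        # sign of t: positive
print(sign_at_infinitesimal([0.5, -1000]))  # 1/2 - 1000t: still positive
print(sign_at_infinitesimal([0, -2, 7]))    # -2t + 7t^2: negative
```

The second example is the key point: no real coefficient on t, however large, can overturn a nonzero constant term, which is exactly the statement that t is smaller than any positive real.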

