INFORMS Annual Meeting 2017


INFORMS Houston – 2017

3 - On Robust Solutions to Uncertain Hierarchical Transmission Expansion Problems
Shisheng Cui, Pennsylvania State University, University Park, PA, United States, cuishisheng05@gmail.com, Uday Shanbhag
We consider a robust formulation for proactive planning of transmission with imperfect long-run transmission pricing based on a MW-km charging system. The deterministic model is described by a mixed-binary mathematical program with equilibrium constraints (MPEC), a challenging optimization problem. Then, leveraging techniques from robust solutions to uncertain complementarity problems, we incorporate robustness into the transmission expansion problem and obtain global solutions via a branching procedure. Preliminary numerics will be provided.
4 - The Value of Flexibility in Robust Location-Transportation Problems
Amir Ardestani-Jaafari, McGill University, Montreal, QC, Canada, amir.ardestanijaafari@mail.mcgill.ca, Erick Delage
We study a robust location-transportation problem in which, while the location and capacity of each facility need to be determined immediately, the final production and distribution of products can be delayed until actual orders are received. Making the wait-and-see decisions adaptable to data while keeping the problem practically attractive is computationally challenging. To overcome this difficulty, we propose a tractable conservative approximation to the problem that offers a promising trade-off between tractability and the flexibility of wait-and-see decisions.

SB82

382B New Directions in Stochastic Programming
Sponsored: Optimization, Optimization Under Uncertainty
Sponsored Session
Chair: Suvrajeet Sen, University of Southern California, Los Angeles, CA, 90089-0193, United States, s.sen@usc.edu
1 - Algebraic Topics for Stochastic Programming
Ruediger Schultz, University of Duisburg-Essen, Department of Mathematics, Thea-Leymann-Str. 9, Essen, D-45127, Germany, ruediger.schultz@uni-due.de
In the talk, selected topics from stochastic programming are discussed where structural and algorithmic findings benefit from an algebraic point of view. In particular, augmentation algorithms for two-stage stochastic integer programs and polynomial optimization models for stationary flows in gas pipelines form the departure points for our considerations.
2 - Multistage Stochastic Programming with Parametric Cost Function Approximations
Raymond Perkins, Princeton University, Princeton, NJ, United States, raymondp@princeton.edu, Warren Powell
A widely used heuristic for solving stochastic optimization problems is a deterministic rolling-horizon procedure that has been modified to handle uncertainty (e.g., buffer stocks, schedule slack). We formalize this approach by proposing such parameterizations as a powerful strategy for hedging against uncertainty, and present a stochastic gradient algorithm for optimizing the parameters in a fully sequential, stochastic base model. This base model, implemented as a simulator, does not require any of the standard assumptions widely used in stochastic programming, such as two-stage models or intertemporal independence. We derive the gradients and test the algorithm on several parameterizations, demonstrating significant improvements over a vanilla deterministic lookahead.
3 - On the Asymptotic Result of Two-stage Stochastic Quadratic Programming
Junyi Liu, University of Southern California, Los Angeles, CA, United States, junyiliu@usc.edu
We consider a class of two-stage stochastic quadratic programming (STQP) problems with continuous random variables. To accommodate this structure, the Stochastic Decomposition (SD) algorithm constructs affine/quadratic minorants and approaches a statistically acceptable solution. On the convergence side, we present the contraction property of the stochastic proximal mapping and show linear convergence to a neighborhood of the optimum with probability one.
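The parametric cost function approximation idea in talk 2 can be illustrated with a minimal sketch: a deterministic order-up-to plan padded by a tunable buffer-stock parameter, tuned by a finite-difference stochastic gradient run against a simulator. Everything concrete below (the Poisson demand model, the cost coefficients, the step-size rule) is a hypothetical stand-in, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cost(theta, n_periods=1000, hold=1.0, short=4.0):
    """Roll a deterministic forecast forward, padded by a buffer stock
    theta, and accumulate holding/shortage costs under random demand.
    (Illustrative stand-in for a 'stochastic base model' simulator.)"""
    forecast = 10.0
    demand = rng.poisson(forecast, size=n_periods)
    order = forecast + theta          # deterministic plan + buffer parameter
    over = np.maximum(order - demand, 0.0)
    under = np.maximum(demand - order, 0.0)
    return (hold * over + short * under).mean()

# Stochastic (finite-difference) gradient descent on the buffer parameter.
theta, step, h = 0.0, 0.5, 0.5
for k in range(200):
    g = (simulate_cost(theta + h) - simulate_cost(theta - h)) / (2 * h)
    theta -= step / (1 + 0.05 * k) * g

print(f"tuned buffer stock: {theta:.2f}")
```

Since shortage is costlier than holding in this toy setup, the tuned buffer comes out positive, mirroring how the parameterization hedges against uncertainty that the deterministic lookahead ignores.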

4 - An Extension to the Network Flow Optimization Problem
Gerrit Slevogt, Universität Duisburg-Essen, Duisburg, Germany, gerrit.slevogt@uni-due.de
A transient gas network model, whose distinguishing feature is that active elements are modelled multiplicatively at the nodes, is extended to stochastic demand, and its structure is presented. Going beyond the existing literature, the modelling of active components is not restricted to compressor stations but includes all common active gas network elements.
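The transient gas model in talk 4 is far richer than anything reproducible from an abstract, but the underlying network-flow-under-stochastic-demand skeleton can be sketched as a scenario-wise min-cost flow. The topology, capacities, costs, and two-point demand distribution below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Edges: (tail, head, capacity, unit cost) -- a toy pipeline topology,
# not the gas-specific model from the talk.
edges = [(0, 1, 10, 1.0), (0, 2, 10, 2.0), (1, 3, 8, 1.0), (2, 3, 10, 1.0)]
cost = [e[3] for e in edges]
cap = [(0, e[2]) for e in edges]

def min_cost_flow(demand):
    """Ship `demand` units from node 0 to node 3 at minimum cost."""
    A_eq = np.array([
        [1, 0, -1, 0],   # node 1: inflow - outflow = 0
        [0, 1, 0, -1],   # node 2: inflow - outflow = 0
        [0, 0, 1, 1],    # sink:   inflow = demand
    ])
    b_eq = np.array([0.0, 0.0, demand])
    res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=cap)
    return res.fun

# Expected cost over demand scenarios. A genuinely stochastic model would
# couple the scenarios through here-and-now decisions; this sketch just
# re-solves per scenario.
scenarios = {6.0: 0.5, 12.0: 0.5}
expected = sum(p * min_cost_flow(d) for d, p in scenarios.items())
print(expected)   # 0.5 * 12 + 0.5 * 28 = 20.0
```

The low demand scenario fits entirely on the cheap upper path; the high one spills onto the expensive arc once the 8-unit capacity binds, which is exactly the kind of recourse effect a stochastic extension has to capture.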

SB83

382C Statistical Learning
Sponsored: Optimization, Optimization Under Uncertainty
Sponsored Session
Chair: Sam Davanloo Tajbakhsh, davanloo-tajbakhsh.1@osu.edu
1 - Symmetry, Saddle Points, and Global Geometry of Non-convex Matrix Factorization
Xingguo Li, University of Minnesota, Minneapolis, MN, 30318, United States, lixx1661@umn.edu
We propose a general theory for the geometry of nonconvex objectives by characterizing stationary points and the null space of the associated Hessian via invariant groups. Moreover, we characterize the global geometry of low-rank matrix factorization and illustrate how the rotational symmetry group gives rise to infinitely many strict saddle points and global minima. We divide the parameter space into three regions: (R1) containing strict saddle points with negative curvature; (R2) containing global minima with restricted strong convexity; and (R3) with large gradient. We further extend our results to matrix sensing. This allows us to show global convergence for popular iterative algorithms with random initialization.
2 - Convex Optimization Problems with Misspecified Vector of Parameters: Solution Framework and Rate Analysis
Hesamoddin Ahmadi, Innovative Scheduling DBA Optym, 5333 SW 75th Street Apt L71, Gainesville, FL, 32608, United States, hesamoddin.ahmadi@optym.com; Pennsylvania State University, University Park, PA, United States
In this talk, we consider a misspecified optimization problem that requires minimizing a function f over a closed and convex set X with an unknown vector of parameters. The source of misspecification is a deterministic vector that can be obtained by solving another suitably defined optimization problem in parallel, referred to as the learning problem. We develop joint first-order schemes for computation and learning and provide rate statements.
3 - Sample Average Approximation with Sparsity-inducing Penalty for High-dimensional Stochastic Programming
Hongcheng Liu, Stanford University, Stanford, CA, 94085, United States, hql5143@stanford.edu, Xue Wang, Tao Yao, Runze Li, Yinyu Ye
The traditional sample average approximation (SAA) scheme for stochastic programming requires the number of samples to be polynomial in the number of problem dimensions in order to ensure proper optimization accuracy. We study a modification to SAA for the scenario where the global minimizer is sparse. By making use of a regularization penalty referred to as the folded concave penalty, we show that the required number of samples can be significantly reduced: the sample size is only required to be poly-logarithmic in the number of dimensions.
4 - On Statistical Learning Problems with Nonconvex L0-Norm Penalty Formulated by Complementarity Constraints
Sam Davanloo Tajbakhsh, The Ohio State University, 210 Baker Systems Building, 1971 Neil Ave, Columbus, OH, 43210, United States, davanloo-tajbakhsh.1@osu.edu
Sparsity in high-dimensional statistical learning can be imposed via a nonconvex l0-norm penalty. From a statistical perspective, some local solutions of nonconvex problems can be superior to global solutions of their convex counterparts. In this talk, we present our findings on solving the original nonconvex problem formulated with complementarity constraints.
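The rotational-symmetry observation in talk 1 of this session is easy to verify numerically: for a symmetric factorization loss, right-multiplying a minimizer by any rotation yields another minimizer, so the global minima form a continuum rather than isolated points. The dimensions and the specific loss below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(U, M):
    """f(U) = ||U U^T - M||_F^2, the symmetric low-rank factorization loss."""
    return np.linalg.norm(U @ U.T - M, "fro") ** 2

n, r = 6, 2
U_star = rng.standard_normal((n, r))
M = U_star @ U_star.T            # rank-r target, so f(U_star) = 0

# Any rotation R in O(r) maps a minimizer to another minimizer:
# (U R)(U R)^T = U (R R^T) U^T = U U^T.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(objective(U_star, M), objective(U_star @ R, M))  # both ~ 0
```

This invariance is why the Hessian at any minimizer has a nontrivial null space along the orbit of the symmetry group, which is exactly the structure the talk's region decomposition (R1)-(R3) is built around.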

