Document Type

Doctor of Philosophy

Information and Systems Engineering

First Adviser

Scheinberg, Katya

Abstract

First-order methods for convex and nonconvex optimization have been an important research topic in recent years. This talk studies and develops efficient first-order algorithms for a variety of problems. We first focus on the widely studied gradient-based methods for composite convex optimization problems, which arise extensively in compressed sensing and machine learning. In particular, we discuss an accelerated first-order scheme and its variants, which attain the “optimal” convergence rate for gradient methods in terms of complexity, and we examine their practical behavior.

In the second part of the talk, we present alternating-direction-type methods for solving structured nonlinear nonconvex problems. The problems we are interested in have a special structure that admits a convenient two-block variable splitting. Our methods rely on solving convex subproblems, and the limit points obtained are guaranteed to satisfy the KKT conditions. Our approach includes the alternating direction method of multipliers (ADMM) and the alternating linearization method (ALM), and we provide convergence rate results for both classes of methods. Moreover, global optimization techniques from the polynomial optimization literature are applied to complement our local methods and to provide lower bounds. Applications include nonconvex problems that have recently arisen in portfolio selection, power systems, and related areas.
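To illustrate the flavor of accelerated gradient scheme mentioned above, the following sketch implements FISTA-style acceleration for an l1-regularized least-squares (LASSO) problem, a standard composite convex model from compressed sensing. The problem instance, step-size rule, and function names here are illustrative assumptions, not the specific variants developed in the dissertation.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (closed form for the l1 term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=500):
    # Accelerated proximal gradient (FISTA-style) sketch for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # Uses a fixed step 1/L, with L the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2          # spectral norm squared of A
    x = np.zeros(A.shape[1])
    y = x.copy()                           # extrapolated point
    t = 1.0                                # momentum parameter
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

The momentum update on `t` and `y` is what lifts the plain proximal-gradient O(1/k) rate to the O(1/k^2) rate referred to as “optimal” for this class of methods.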
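The two-block variable splitting behind ADMM can be sketched on a convex model problem: the LASSO objective is split as f(x) + g(z) with the coupling constraint x = z, so each subproblem is convex with a closed-form solution. The choice of problem, penalty parameter `rho`, and function name are illustrative assumptions; the dissertation applies this template to structured nonconvex problems.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=300):
    # Two-block ADMM sketch for
    #   min 0.5 * ||A x - b||^2 + lam * ||z||_1   s.t.  x = z.
    # x-update: convex quadratic subproblem (linear solve).
    # z-update: proximal operator of the l1 term (soft-thresholding).
    # u-update: scaled dual (multiplier) ascent on the constraint x = z.
    n = A.shape[1]
    AtA = A.T @ A
    Atb = A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))   # cached for repeated x-updates
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                            # scaled dual variable
    for _ in range(n_iter):
        x = M @ (Atb + rho * (z - u))
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z
    return z
```

In the nonconvex settings the abstract describes, the same alternating structure is retained: the splitting is chosen so that each block subproblem remains convex, which is what makes it possible to guarantee that limit points satisfy the KKT conditions.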