Fast Randomized Algorithms for Convex Optimization and Statistical Estimation

Mert Pilanci

EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2016-147
August 14, 2016

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-147.pdf

With the advent of massive datasets, statistical learning and information processing techniques are expected to enable exceptional possibilities for engineering, data-intensive sciences, and better decision making. Unfortunately, existing algorithms for mathematical optimization, the core component of these techniques, often fail to scale to the full extent of the available data. In recent years, randomized dimension reduction has proven to be a very powerful tool for approximate computations over large datasets. In this thesis, we consider random projection methods in the context of general convex optimization problems on massive datasets. We explore many applications in machine learning, statistics, and decision making, and analyze various forms of randomization in detail. The central contributions of this thesis are as follows: (i) We develop random projection methods for convex optimization problems and establish fundamental trade-offs between the size of the projection and the accuracy of the solution. (ii) We characterize information-theoretic limitations of methods based on random projection, showing, surprisingly, that the most widely used form of random projection is in fact statistically sub-optimal. (iii) We present novel methods that iteratively refine the solutions to achieve statistical optimality, enabling large-scale optimization and statistical inference problems to be solved orders of magnitude faster than with existing methods. (iv) We develop new randomized methodologies for relaxing cardinality constraints in order to obtain checkable and more accurate approximations than state-of-the-art approaches.
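To make the flavor of these contributions concrete, the following is a minimal, illustrative NumPy sketch, not the thesis's actual algorithms: it contrasts the classical single-shot "sketch-and-solve" scheme referred to in (ii) with a simple iterative refinement in the spirit of (iii), on an unconstrained least-squares problem with a Gaussian sketching matrix. The problem sizes n, d, m, the step count, and the noise level are arbitrary illustrative choices.

    import numpy as np

    # Illustrative example only: Gaussian sketching for least squares.
    # The thesis's algorithms, sketch families, and guarantees are more general.

    rng = np.random.default_rng(0)

    n, d, m = 20_000, 50, 400            # data rows, dimensions, sketch size (m << n)
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    y = A @ x_true + 0.1 * rng.standard_normal(n)

    # Classical "sketch-and-solve": project both A and y, then solve the small problem.
    # This single-shot scheme is the widely used one characterized as statistically sub-optimal.
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    x_classical = np.linalg.lstsq(S @ A, S @ y, rcond=None)[0]

    # Iterative refinement: draw a fresh sketch of A at each step and take a
    # Newton-like step using the exact residual, in the spirit of contribution (iii).
    x = np.zeros(d)
    for _ in range(5):
        S_t = rng.standard_normal((m, n)) / np.sqrt(m)
        SA = S_t @ A
        r = A.T @ (y - A @ x)                  # exact A^T(y - Ax), the negative gradient
        x = x + np.linalg.solve(SA.T @ SA, r)  # step with the sketched Hessian (S_t A)^T (S_t A)

    x_exact = np.linalg.lstsq(A, y, rcond=None)[0]
    print("classical sketch error  :", np.linalg.norm(x_classical - x_true))
    print("iterative sketch error  :", np.linalg.norm(x - x_true))
    print("full least-squares error:", np.linalg.norm(x_exact - x_true))

The key difference illustrated here is that the iterative variant uses the exact gradient at every step and only approximates curvature through the sketch, so its accuracy is not limited by the sketch size alone, whereas the single-shot scheme's is.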

Advisors: Laurent El Ghaoui and Martin Wainwright


BibTeX citation:

@phdthesis{Pilanci:EECS-2016-147,
    Author = {Pilanci, Mert},
    Title = {Fast Randomized Algorithms for Convex Optimization and Statistical Estimation},
    School = {EECS Department, University of California, Berkeley},
    Year = {2016},
    Month = {Aug},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-147.html},
    Number = {UCB/EECS-2016-147},
    Abstract = {With the advent of massive datasets, statistical learning and information processing techniques are expected to enable exceptional possibilities for engineering, data-intensive sciences, and better decision making. Unfortunately, existing algorithms for mathematical optimization, the core component of these techniques, often fail to scale to the full extent of the available data. In recent years, randomized dimension reduction has proven to be a very powerful tool for approximate computations over large datasets. In this thesis, we consider random projection methods in the context of general convex optimization problems on massive datasets. We explore many applications in machine learning, statistics, and decision making, and analyze various forms of randomization in detail. The central contributions of this thesis are as follows: (i) We develop random projection methods for convex optimization problems and establish fundamental trade-offs between the size of the projection and the accuracy of the solution. (ii) We characterize information-theoretic limitations of methods based on random projection, showing, surprisingly, that the most widely used form of random projection is in fact statistically sub-optimal. (iii) We present novel methods that iteratively refine the solutions to achieve statistical optimality, enabling large-scale optimization and statistical inference problems to be solved orders of magnitude faster than with existing methods. (iv) We develop new randomized methodologies for relaxing cardinality constraints in order to obtain checkable and more accurate approximations than state-of-the-art approaches.}
}

EndNote citation:

%0 Thesis
%A Pilanci, Mert
%T Fast Randomized Algorithms for Convex Optimization and Statistical Estimation
%I EECS Department, University of California, Berkeley
%D 2016
%8 August 14
%@ UCB/EECS-2016-147
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-147.html
%F Pilanci:EECS-2016-147