Horia Mania and Xinghao Pan and Dimitris Papailiopoulos and Benjamin Recht and Kannan Ramchandran and Michael Jordan

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2019-12

May 1, 2019

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-12.pdf

We introduce and analyze stochastic optimization methods where the input to each update is perturbed by bounded noise. We show that this framework forms the basis of a unified approach to analyze asynchronous implementations of stochastic optimization algorithms, by viewing them as serial methods operating on noisy inputs. Using our perturbed iterate framework, we provide new analyses of the Hogwild! algorithm and of asynchronous stochastic coordinate descent that are simpler than earlier analyses, remove many assumptions of previous models, and in some cases yield improved upper bounds on convergence rates. We proceed to apply our framework to develop and analyze Kromagnon: a novel, parallel, sparse stochastic variance-reduced gradient (SVRG) algorithm. We demonstrate experimentally on a 16-core machine that the sparse and parallel version of SVRG is in some cases more than four orders of magnitude faster than the standard SVRG algorithm.
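To make the perturbed iterate view concrete, here is a minimal sketch (not the report's code) of serial SGD whose update is evaluated at a noisy copy of the true iterate; the bounded perturbation stands in for the inconsistent reads of a Hogwild-style lock-free implementation. The least-squares problem, step size, and noise bound below are illustrative assumptions.

    # A minimal sketch of the perturbed iterate framework, assuming a toy
    # least-squares objective f(x) = (1/2n) * ||Ax - b||^2 (hypothetical data).
    # The perturbed read x_hat models the stale/inconsistent state an
    # asynchronous core would observe; ||x_hat - x|| stays bounded by delta.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 10
    A = rng.standard_normal((n, d))
    x_star = rng.standard_normal(d)
    b = A @ x_star

    def stochastic_grad(x, i):
        """Gradient of the i-th component f_i(x) = 0.5 * (a_i . x - b_i)^2."""
        return A[i] * (A[i] @ x - b[i])

    gamma = 0.01   # step size (illustrative)
    delta = 1e-3   # bound on the input perturbation ||x_hat - x||

    x = np.zeros(d)
    for t in range(5000):
        i = rng.integers(n)                # sample a component uniformly
        noise = rng.uniform(-1, 1, size=d)
        x_hat = x + delta * noise / np.linalg.norm(noise)  # perturbed read
        x = x - gamma * stochastic_grad(x_hat, i)          # serial update

    print("distance to optimum:", np.linalg.norm(x - x_star))

In the actual asynchronous setting, x_hat is not injected noise but the state a core happens to read while other cores are writing; the analysis then bounds the gap between x_hat and the true iterate in terms of the maximum update delay.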

Advisors: Michael Jordan and Benjamin Recht


BibTeX citation:

@mastersthesis{Mania:EECS-2019-12,
    Author= {Mania, Horia and Pan, Xinghao and Papailiopoulos, Dimitris and Recht, Benjamin and Ramchandran, Kannan and Jordan, Michael},
    Title= {Perturbed Iterate Analysis for Asynchronous Stochastic Optimization},
    School= {EECS Department, University of California, Berkeley},
    Year= {2019},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-12.html},
    Number= {UCB/EECS-2019-12},
    Abstract= {We introduce and analyze stochastic optimization methods where the input to each update is perturbed by bounded noise. We show that this framework forms the basis of a unified approach to analyze asynchronous implementations of stochastic optimization algorithms, by viewing them as serial methods operating on noisy inputs. Using our perturbed iterate framework, we provide new analyses of the Hogwild! algorithm and of asynchronous stochastic coordinate descent that are simpler than earlier analyses, remove many assumptions of previous models, and in some cases yield improved upper bounds on convergence rates. We proceed to apply our framework to develop and analyze Kromagnon: a novel, parallel, sparse stochastic variance-reduced gradient (SVRG) algorithm. We demonstrate experimentally on a 16-core machine that the sparse and parallel version of SVRG is in some cases more than four orders of magnitude faster than the standard SVRG algorithm.},
}

EndNote citation:

%0 Thesis
%A Mania, Horia 
%A Pan, Xinghao 
%A Papailiopoulos, Dimitris 
%A Recht, Benjamin 
%A Ramchandran, Kannan 
%A Jordan, Michael 
%T Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
%I EECS Department, University of California, Berkeley
%D 2019
%8 May 1
%@ UCB/EECS-2019-12
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-12.html
%F Mania:EECS-2019-12