Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks and Factor Graphs

Haoyu Chen and John F. Canny

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2016-39

May 3, 2016

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-39.pdf

A fundamental task in machine learning and related fields is to perform inference on probabilistic graphical models. Since exact inference takes exponential time in general, a variety of approximate methods are used. Gibbs sampling is one of the most accurate approaches and provides unbiased samples from the posterior, but it has historically been too expensive for large models. In this paper, we present an optimized, parallel Gibbs sampler augmented with state replication (SAME, or State Augmented Marginal Estimation) to decrease convergence time. We find that SAME can improve the quality of parameter estimates while accelerating convergence. Experiments on both synthetic and real data show that our Gibbs sampler is substantially faster than the state-of-the-art sampler JAGS, without sacrificing accuracy. Our ultimate objective is to introduce the Gibbs sampler to researchers in many fields to expand their range of feasible inference problems.
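For readers unfamiliar with the SAME idea, the sketch below shows a minimal blocked Gibbs sampler with state replication on a toy discrete mixture model. The model, variable names, replica count m, and hyperparameters are assumptions chosen purely for illustration; this is not the report's optimized parallel implementation, which targets general discrete Bayesian networks and factor graphs.

```python
# Minimal, illustrative sketch of SAME (State Augmented Marginal Estimation)
# Gibbs sampling on a toy discrete mixture. All model choices here are
# assumptions for illustration, not the report's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy model: z_i ~ Categorical(pi), x_i | z_i ~ Categorical(phi[z_i])
K, V, N = 3, 5, 200                      # components, symbols, data points
true_pi = rng.dirichlet(np.ones(K))
true_phi = rng.dirichlet(np.ones(V), size=K)
z_true = rng.choice(K, size=N, p=true_pi)
x = np.array([rng.choice(V, p=true_phi[k]) for k in z_true])

alpha, beta = 1.0, 1.0                   # symmetric Dirichlet priors
m = 5                                    # number of SAME state replicas

# m independent copies of the latent labels share one set of parameters.
z = rng.choice(K, size=(m, N))

for sweep in range(500):
    # Resample parameters from a Dirichlet posterior that pools counts over
    # all m replicas; pooling sharpens the parameter posterior (roughly
    # raising it to the m-th power), which is the SAME idea.
    comp_counts = np.zeros(K)
    emit_counts = np.zeros((K, V))
    for r in range(m):
        np.add.at(comp_counts, z[r], 1.0)
        np.add.at(emit_counts, (z[r], x), 1.0)
    pi = rng.dirichlet(alpha + comp_counts)
    phi = np.array([rng.dirichlet(beta + emit_counts[k]) for k in range(K)])

    # Resample every replica's latent labels given the shared parameters.
    # Replicas are conditionally independent, so this step parallelizes.
    logp = np.log(pi)[None, :] + np.log(phi.T)[x]        # (N, K) scores
    probs = np.exp(logp - logp.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    for r in range(m):
        u = rng.random(N)[:, None]
        z[r] = (probs.cumsum(axis=1) > u).argmax(axis=1)

print("estimated component weights:", np.round(np.sort(pi), 3))
print("true component weights:     ", np.round(np.sort(true_pi), 3))
```

In this toy, increasing m concentrates the sampled parameters around the MAP estimate, trading some posterior variance for faster, sharper convergence; the replica resampling step is embarrassingly parallel, which is what a parallel SAME sampler exploits.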

Advisor: John F. Canny


BibTeX citation:

@mastersthesis{Chen:EECS-2016-39,
    Author= {Chen, Haoyu and Canny, John F.},
    Title= {Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks and Factor Graphs},
    School= {EECS Department, University of California, Berkeley},
    Year= {2016},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-39.html},
    Number= {UCB/EECS-2016-39},
    Abstract= {A fundamental task in machine learning and related fields is to perform inference on probabilistic graphical models. Since exact inference takes exponential time in general, a variety of approximate methods are used. Gibbs sampling is one of the most accurate approaches and provides unbiased samples from the posterior but it has historically been too expensive for large models. In this paper, we present an optimized, parallel Gibbs sampler augmented with state replication (SAME or State Augmented Marginal Estimation) to decrease convergence time. We find that SAME can improve the quality of parameter estimates while accelerating convergence. Experiments on both synthetic and real data show that our Gibbs sampler is substantially faster
than the state of the art sampler, JAGS, without sacrificing accuracy. Our ultimate objective is to introduce the Gibbs sampler to researchers in many fields
to expand their range of feasible inference problems.},
}

EndNote citation:

%0 Thesis
%A Chen, Haoyu 
%A Canny, John F. 
%T Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks and Factor Graphs
%I EECS Department, University of California, Berkeley
%D 2016
%8 May 3
%@ UCB/EECS-2016-39
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-39.html
%F Chen:EECS-2016-39