Gibbs sampling in open-universe stochastic languages
Nimar S. Arora, Rodrigo de Salvo Braz, Erik Sudderth, and Stuart J. Russell
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2010-34
March 27, 2010
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2010/EECS-2010-34.pdf
Abstract:
Languages for open-universe probability models (OUPMs) can represent situations with an unknown number of objects and identity uncertainty, which comprise a very important class of real-world applications. Current general-purpose inference methods for such languages are, however, much less efficient than those implemented for more restricted languages or for specific model classes. This paper goes some way toward remedying the deficit by introducing, and proving correct, a general method for Gibbs sampling in partial worlds where model structure may vary across worlds. The method draws on and extends previous results on generic OUPM inference and on auxiliary-variable Gibbs sampling for non-parametric mixture models. It has been implemented for BLOG, a well-known OUPM language. Combined with compile-time optimizations, it yields very substantial speedups over existing methods on several test cases and substantially improves the practicality of OUPM languages generally.
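The report's contribution is Gibbs sampling over partial worlds whose structure varies across samples; the algorithm itself is given in the PDF above. For readers unfamiliar with the baseline technique, the following is a minimal sketch of ordinary Gibbs sampling on a fixed-structure model, a two-component Gaussian mixture with unit observation variance, uniform mixing weights, and a N(0, 100) prior on each mean. This is illustrative only and is not the paper's partial-world method; all names, priors, and parameters here are assumptions of the sketch.

```python
import math
import random

def gibbs_mixture(data, iters=200, seed=0):
    """Gibbs sampling for a 2-component Gaussian mixture.

    Alternates between (a) resampling each component mean from its
    Gaussian posterior given the current assignments, and (b) resampling
    each point's assignment given the current means.
    """
    rng = random.Random(seed)
    n = len(data)
    z = [rng.randrange(2) for _ in range(n)]  # component assignments
    mu = [0.0, 0.0]                           # component means
    prior_var = 100.0                         # variance of N(0, 100) prior on means

    for _ in range(iters):
        # (a) Resample each mean given the points currently assigned to it.
        for k in range(2):
            pts = [x for x, zi in zip(data, z) if zi == k]
            m = len(pts)
            # Conjugate Gaussian update: unit likelihood variance, N(0, prior_var) prior.
            post_var = 1.0 / (m + 1.0 / prior_var)
            post_mean = post_var * sum(pts)
            mu[k] = rng.gauss(post_mean, math.sqrt(post_var))

        # (b) Resample each assignment given the current means
        # (uniform mixing weights, so only the likelihood matters).
        for i, x in enumerate(data):
            logp = [-0.5 * (x - mu[k]) ** 2 for k in range(2)]
            mx = max(logp)
            w = [math.exp(lp - mx) for lp in logp]
            z[i] = 0 if rng.random() < w[0] / (w[0] + w[1]) else 1

    return mu, z
```

On well-separated data such as points clustered around -5 and 5, the sampled means settle near the cluster centers within a few sweeps. The open-universe setting of the paper generalizes this picture: there, the set of variables itself (e.g. the number of components) can differ from one sampled world to the next.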
BibTeX citation:
@techreport{Arora:EECS-2010-34,
    Author = {Arora, Nimar S and de Salvo Braz, Rodrigo and Sudderth, Erik and Russell, Stuart J.},
    Title = {Gibbs sampling in open-universe stochastic languages},
    Institution = {EECS Department, University of California, Berkeley},
    Year = {2010},
    Month = {Mar},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2010/EECS-2010-34.html},
    Number = {UCB/EECS-2010-34},
    Abstract = {Languages for open-universe probability models (OUPMs) can represent situations with an unknown number of objects and identity uncertainty, which comprise a very important class of real-world applications. Current general-purpose inference methods for such languages are, however, much less efficient than those implemented for more restricted languages or for specific model classes. This paper goes some way to remedying the deficit by introducing, and proving correct, a general method for Gibbs sampling in partial worlds where model structure may vary across worlds. The method draws on and extends previous results on generic OUPM inference and on auxiliary-variable Gibbs sampling for non-parametric mixture models. It has been implemented for BLOG, a well-known OUPM language. Combined with compile-time optimizations, it yields very substantial speedups over existing methods on several test cases and substantially improves the practicality of OUPM languages generally.}
}
EndNote citation:
%0 Report
%A Arora, Nimar S
%A de Salvo Braz, Rodrigo
%A Sudderth, Erik
%A Russell, Stuart J.
%T Gibbs sampling in open-universe stochastic languages
%I EECS Department, University of California, Berkeley
%D 2010
%8 March 27
%@ UCB/EECS-2010-34
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2010/EECS-2010-34.html
%F Arora:EECS-2010-34