Compositional Modeling with DPNs

Geoffrey Zweig and Stuart Russell

EECS Department, University of California, Berkeley

Technical Report No. UCB/CSD-97-970

September 1997

http://www2.eecs.berkeley.edu/Pubs/TechRpts/1997/CSD-97-970.pdf

Dynamic probabilistic networks (DPNs) are a powerful and efficient method for encoding stochastic temporal models. In the past, however, their use has been largely confined to the description of uniform temporal processes. In this paper we show how to combine specialized DPN models to represent inhomogeneous processes that progress through a sequence of different stages. We develop a method that takes a set of DPN submodels and a stochastic finite state automaton that defines a legal set of submodel concatenations, and constructs a composite DPN. The composite DPN is shown to represent correctly the intended probability distribution over possible histories of the temporal process. The use of DPNs allows us to take advantage of efficient, general-purpose inference and learning algorithms and can confer significant advantages over HMMs in terms of statistical efficiency and representational flexibility. We illustrate these advantages in the context of speech recognition.
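
The report's actual construction operates at the DPN level, preserving factored state; as a rough flat-state analogue only, the sketch below illustrates what "concatenating submodels under a stochastic finite state automaton" means. Each toy submodel is a small Markov chain with per-state exit probabilities, and an FSA over submodel labels weights which submodel may follow which. The names (compose, submodels, fsa) and the whole flat-matrix formulation are illustrative assumptions, not the paper's algorithm.

    import numpy as np

    # Illustrative sketch (not the paper's DPN construction): each submodel
    # is an internal transition matrix plus a per-state exit probability; a
    # stochastic FSA over submodel labels gives the allowed concatenations.
    # We splice everything into one composite transition matrix.

    def compose(submodels, fsa):
        """submodels: dict label -> (trans, exit_p), trans is (k,k) row-stochastic,
        exit_p is (k,) with the probability of leaving the submodel from each state.
        fsa: dict label -> {next_label: prob}, each row summing to 1.
        Returns the composite transition matrix and an index of (label, state)."""
        index = [(lbl, s) for lbl, (trans, _) in submodels.items()
                 for s in range(trans.shape[0])]
        pos = {ls: i for i, ls in enumerate(index)}
        P = np.zeros((len(index), len(index)))
        for lbl, (trans, exit_p) in submodels.items():
            k = trans.shape[0]
            for s in range(k):
                stay = 1.0 - exit_p[s]
                for t in range(k):  # internal moves, rescaled by the stay probability
                    P[pos[(lbl, s)], pos[(lbl, t)]] = stay * trans[s, t]
                for nxt, w in fsa[lbl].items():  # exit into the next submodel's start state
                    P[pos[(lbl, s)], pos[(nxt, 0)]] += exit_p[s] * w
        return P, index

    # Two 2-state submodels "a" and "b"; the FSA allows a->a, a->b, b->a.
    submodels = {
        "a": (np.array([[0.7, 0.3], [0.0, 1.0]]), np.array([0.0, 0.4])),
        "b": (np.array([[0.5, 0.5], [0.0, 1.0]]), np.array([0.0, 0.6])),
    }
    fsa = {"a": {"a": 0.2, "b": 0.8}, "b": {"a": 1.0}}
    P, index = compose(submodels, fsa)
    assert np.allclose(P.sum(axis=1), 1.0)  # composite rows remain stochastic
    print(index)

The point of the paper, by contrast, is to carry out this composition without flattening: the composite model stays a DPN with factored state, so the general-purpose DPN inference and learning algorithms apply directly. The flat matrix above only makes the concatenation semantics concrete.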

BibTeX citation:

@techreport{Zweig:CSD-97-970,
    Author= {Zweig, Geoffrey and Russell, Stuart},
    Title= {Compositional Modeling with DPNs},
    Year= {1997},
    Month= {Sep},
    Institution= {EECS Department, University of California, Berkeley},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/1997/5840.html},
    Number= {UCB/CSD-97-970},
    Abstract= {Dynamic probabilistic networks (DPNs) are a powerful and efficient method for encoding stochastic temporal models. In the past, however, their use has been largely confined to the description of uniform temporal processes. In this paper we show how to combine specialized DPN models to represent inhomogeneous processes that progress through a sequence of different stages. We develop a method that takes a set of DPN submodels and a stochastic finite state automaton that defines a legal set of submodel concatenations, and constructs a composite DPN. The composite DPN is shown to represent correctly the intended probability distribution over possible histories of the temporal process. The use of DPNs allows us to take advantage of efficient, general-purpose inference and learning algorithms and can confer significant advantages over HMMs in terms of statistical efficiency and representational flexibility. We illustrate these advantages in the context of speech recognition.},
}

EndNote citation:

%0 Report
%A Zweig, Geoffrey 
%A Russell, Stuart 
%T Compositional Modeling with DPNs
%I EECS Department, University of California, Berkeley
%D 1997
%@ UCB/CSD-97-970
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/1997/5840.html
%F Zweig:CSD-97-970