Maximum Entropy Probabilistic Logic

Mark A. Paskin

EECS Department
University of California, Berkeley
Technical Report No. UCB/CSD-01-1161
2002

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2001/CSD-01-1161.pdf

Recent research has shown there are two types of uncertainty that can be expressed in first-order logic -- propositional and statistical uncertainty -- and that both types can be represented in terms of probability spaces. However, these efforts have fallen short of providing a general account of how to design probability measures for these spaces; as a result, we lack a crucial component of any system that reasons under these types of uncertainty. In this paper, we describe an automatic procedure for defining such measures in terms of a probabilistic knowledge base. In particular, we employ the principle of maximum entropy to select measures that are consistent with our knowledge and that make the fewest assumptions in doing so. This approach yields models of first-order uncertainty that are principled, intuitive, and economical in their representation.
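The abstract's central idea is the principle of maximum entropy: among all probability measures consistent with a set of constraints, choose the one that makes the fewest additional assumptions, i.e. the one with the largest entropy. The sketch below is not the paper's construction; it is a minimal, generic illustration of that principle on a toy problem (a six-sided die constrained to have mean 4.5), using the standard exponential-family form of the maximizer and a one-dimensional root find for the Lagrange multiplier.

    # A minimal sketch (not the paper's method): maximum entropy on a toy problem.
    # Among all distributions over the faces of a die satisfying E[X] = 4.5,
    # the maximum-entropy one has the form p_i proportional to exp(lam * x_i),
    # so it suffices to solve for the multiplier lam that meets the constraint.

    import numpy as np
    from scipy.optimize import brentq

    x = np.arange(1, 7)       # outcomes of a six-sided die
    target_mean = 4.5         # the single expectation constraint (assumed for illustration)

    def mean_under(lam):
        """Expected value of X under p_i proportional to exp(lam * x_i)."""
        w = np.exp(lam * x)
        p = w / w.sum()
        return p @ x

    # Find the Lagrange multiplier that satisfies the constraint.
    lam = brentq(lambda l: mean_under(l) - target_mean, -10.0, 10.0)
    w = np.exp(lam * x)
    p = w / w.sum()

    entropy = -(p * np.log(p)).sum()
    print("max-entropy distribution:", np.round(p, 4))
    print("mean:", round(p @ x, 4), "entropy:", round(entropy, 4))

With no constraint beyond normalization, the same procedure would return the uniform distribution, which is the sense in which maximum entropy "makes the fewest assumptions" consistent with what is known.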

\"Edit"; ?>


BibTeX citation:

@techreport{Paskin:CSD-01-1161,
    Author = {Paskin, Mark A.},
    Title = {Maximum Entropy Probabilistic Logic},
    Institution = {EECS Department, University of California, Berkeley},
    Year = {2002},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/5717.html},
    Number = {UCB/CSD-01-1161},
    Abstract = {Recent research has shown there are two types of uncertainty that can be expressed in first-order logic -- propositional and statistical uncertainty -- and that both types can be represented in terms of probability spaces. However, these efforts have fallen short of providing a general account of how to design probability measures for these spaces; as a result, we lack a crucial component of any system that reasons under these types of uncertainty. In this paper, we describe an automatic procedure for defining such measures in terms of a probabilistic knowledge base. In particular, we employ the principle of maximum entropy to select measures that are consistent with our knowledge and that make the fewest assumptions in doing so. This approach yields models of first-order uncertainty that are principled, intuitive, and economical in their representation.}
}

EndNote citation:

%0 Report
%A Paskin, Mark A.
%T Maximum Entropy Probabilistic Logic
%I EECS Department, University of California, Berkeley
%D 2002
%@ UCB/CSD-01-1161
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/5717.html
%F Paskin:CSD-01-1161