New Information Inequalities with Applications to Statistics

Kuan-Yun Lee

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2022-37

May 4, 2022

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-37.pdf

We introduce, under a parametric framework, a family of inequalities between mutual information and Fisher information. These inequalities are indexed by reference measures satisfying a log-Sobolev inequality (LSI), and they reveal previously unknown connections between LSIs and statistical inequalities. One such connection is shown for the celebrated van Trees inequality: under a Gaussian reference measure, we recover a stronger entropic inequality due to Efroimovich. We further present two new inequalities for log-concave priors that do not depend on the Fisher information of the prior and are applicable in certain scenarios where the van Trees inequality and Efroimovich’s inequality cannot be applied. We also illustrate a procedure for establishing lower bounds on risk under general loss functions, and apply it in several statistical settings, including the generalized linear model and a general pairwise comparison framework.
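For orientation, the two classical objects named in the abstract can be written out in their standard scalar forms. This is a minimal sketch of the usual textbook statements, not a quotation of the report's own formulations, which may differ in generality:

% Log-Sobolev inequality for a reference measure \mu with constant c > 0:
% for all sufficiently smooth f,
\operatorname{Ent}_{\mu}(f^2)
  := \int f^2 \log f^2 \, d\mu
     - \left( \int f^2 \, d\mu \right) \log \left( \int f^2 \, d\mu \right)
  \;\le\; 2c \int |\nabla f|^2 \, d\mu .
% The standard Gaussian measure satisfies this with c = 1.

% Van Trees (Bayesian Cramer-Rao) inequality: for a prior \pi on \theta and
% any estimator \hat{\theta}(X), with expectation over both X and \theta \sim \pi,
\mathbb{E}\!\left[ \big( \hat{\theta}(X) - \theta \big)^2 \right]
  \;\ge\; \frac{1}{\mathbb{E}_{\pi}\!\left[ I_X(\theta) \right] + I(\pi)},
\qquad
I(\pi) = \int \frac{\big( \pi'(\theta) \big)^2}{\pi(\theta)} \, d\theta ,
% where I_X(\theta) is the Fisher information of the statistical model.

The appearance of I(\pi), the Fisher information of the prior, in the van Trees bound is what the report's two new inequalities for log-concave priors avoid.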

Advisor: Thomas Courtade


BibTeX citation:

@phdthesis{Lee:EECS-2022-37,
    Author= {Lee, Kuan-Yun},
    Title= {New Information Inequalities with Applications to Statistics},
    School= {EECS Department, University of California, Berkeley},
    Year= {2022},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-37.html},
    Number= {UCB/EECS-2022-37},
    Abstract= {We introduce, under a parametric framework, a family of inequalities between mutual information and Fisher information. These inequalities are indexed by reference measures satisfying a log-Sobolev inequality (LSI), and reveal previously unknown connections between LSIs and statistical inequalities. One such connection is shown for the celebrated van Trees inequality by recovering under a Gaussian reference measure a stronger entropic inequality due to Efroimovich. We further present two new inequalities for log-concave priors that do not depend on the Fisher information of the prior and are applicable under certain scenarios where the van Trees inequality and Efroimovich’s inequality cannot be applied. We illustrate a procedure to establish lower bounds on risk under general loss functions, and apply it under several statistical settings, including the Generalized Linear Model and a general pairwise comparison framework.},
}

EndNote citation:

%0 Thesis
%A Lee, Kuan-Yun 
%T New Information Inequalities with Applications to Statistics
%I EECS Department, University of California, Berkeley
%D 2022
%8 May 4
%@ UCB/EECS-2022-37
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-37.html
%F Lee:EECS-2022-37