Improving the Efficiency of Robust Generative Classifiers
Alan Rosenthal
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2021-68
May 13, 2021
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-68.pdf
The phenomenon of adversarial examples in neural networks has spurred the development of robust classification methods that are immune to these vulnerabilities. Classifiers using generative models, including Analysis by Synthesis (ABS) introduced by Schott et al. and its extension E-ABS by Ju et al., have achieved state-of-the-art robust accuracy on several benchmark datasets like SVHN and MNIST. Their inference time complexity, however, scales linearly with the number of classes in the data, limiting their practicality. We evaluate two approaches to speed up ABS-style models and inference: first, a hierarchical decision tree framework that achieves accuracy nearly on par with E-ABS in logarithmic time, and second, a scheme to classify latent vectors based on a prior distribution in constant time. We also provide an algorithm to search over decision tree structures, which yields significant improvements in accuracy over naive arrangements.
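The efficiency gain described above comes from replacing a linear scan over all per-class generative models with a sequence of binary decisions down a tree. The following is a minimal sketch of that contrast, not the report's actual implementation: the scoring functions, the `TreeNode` structure, and the split predicates are illustrative stand-ins for the per-class models and learned discriminators.

```python
def abs_style_predict(x, class_scores):
    """ABS-style inference: evaluate every per-class model, O(C) in the
    number of classes. class_scores maps label -> scoring function."""
    return max(class_scores, key=lambda c: class_scores[c](x))

class TreeNode:
    """A node in a hierarchical classifier: leaves carry a class label,
    internal nodes carry a binary decision function."""
    def __init__(self, left=None, right=None, label=None, split=None):
        self.left, self.right = left, right
        self.label = label   # set only at leaves
        self.split = split   # set only at internal nodes

def tree_predict(x, node):
    """Hierarchical inference: one binary decision per level, so a
    balanced tree over C classes needs only O(log C) evaluations."""
    while node.label is None:
        node = node.left if node.split(x) else node.right
    return node.label
```

For example, four classes arranged in a balanced tree require two split evaluations per input instead of four model evaluations, and the gap widens as the class count grows.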
Advisor: David A. Wagner
BibTeX citation:
@mastersthesis{Rosenthal:EECS-2021-68,
    Author = {Rosenthal, Alan},
    Title = {Improving the Efficiency of Robust Generative Classifiers},
    School = {EECS Department, University of California, Berkeley},
    Year = {2021},
    Month = {May},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-68.html},
    Number = {UCB/EECS-2021-68},
    Abstract = {The phenomenon of adversarial examples in neural networks has spurred the development of robust classification methods that are immune to these vulnerabilities. Classifiers using generative models, including Analysis by Synthesis (ABS) introduced by Schott et al. and its extension E-ABS by Ju et al., have achieved state-of-the-art robust accuracy on several benchmark datasets like SVHN and MNIST. Their inference time complexity, however, scales linearly with the number of classes in the data, limiting their practicality. We evaluate two approaches to speed up ABS-style models and inference: first, a hierarchical decision tree framework that achieves accuracy nearly on par with E-ABS in logarithmic time, and second, a scheme to classify latent vectors based on a prior distribution in constant time. We also provide an algorithm to search over decision tree structures, which yields significant improvements in accuracy over naive arrangements.},
}
EndNote citation:
%0 Thesis
%A Rosenthal, Alan
%T Improving the Efficiency of Robust Generative Classifiers
%I EECS Department, University of California, Berkeley
%D 2021
%8 May 13
%@ UCB/EECS-2021-68
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-68.html
%F Rosenthal:EECS-2021-68