Interpretable Few-Shot Image Classification with Neural-Backed Decision Trees
Scott Lee
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2020-71
May 27, 2020
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-71.pdf
Recent advances in deep learning techniques, especially convolutional neural networks (CNNs), have caused an explosion in their popularity for image-based tasks. Although CNNs achieve state-of-the-art accuracy and can be applied to a variety of tasks, they are difficult to interpret, as humans often cannot explain the reasoning behind the decisions made by these “black box” models. We propose a novel method, Neural-Backed Decision Trees (NBDTs), that combines the explainable nature of decision trees with the high predictive power of neural networks, and demonstrate competitive accuracy on standard image classification tasks (CIFAR10, CIFAR100, TinyImageNet). We quantitatively and qualitatively analyze the quality of explanations for the decisions made by our method, finding that related classes are clustered with higher similarity compared to lexical methods. Additionally, we extend NBDTs to few-shot image classification on the Animals with Attributes 2 dataset, maintaining high accuracy on seen classes and achieving competitive accuracy for few-shot classes.
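To make the idea concrete, here is a minimal toy sketch of NBDT-style inference: leaf vectors stand in for the rows of a trained CNN's final linear layer, inner nodes take the mean of their descendant leaves, and a sample is classified by greedily descending the induced hierarchy. All vectors, class names, and the tree itself are made up for illustration; this is one common NBDT-style construction, not the report's actual implementation.

```python
import numpy as np

# Toy leaf representatives standing in for the rows of a trained CNN's
# final linear layer (one vector per class). These values are invented.
leaf_w = {
    "cat":   np.array([1.0, 0.1, 0.0]),
    "dog":   np.array([0.9, 0.2, 0.1]),
    "car":   np.array([0.0, 1.0, 0.9]),
    "truck": np.array([0.1, 0.9, 1.0]),
}

# Induced hierarchy: each inner node groups semantically related leaves.
tree = {
    "root":    ["animal", "vehicle"],
    "animal":  ["cat", "dog"],
    "vehicle": ["car", "truck"],
}

def representative(node):
    """Mean of descendant leaf vectors; a leaf is its own representative."""
    if node in leaf_w:
        return leaf_w[node]
    return np.mean([representative(c) for c in tree[node]], axis=0)

def nbdt_predict(x):
    """Greedily descend the tree, at each node taking the child whose
    representative has the largest inner product with the feature x."""
    node, path = "root", []
    while node in tree:
        node = max(tree[node], key=lambda c: float(x @ representative(c)))
        path.append(node)
    return node, path

# A feature pointing strongly along the first axis lands on "cat",
# and the path through the hierarchy is the explanation.
label, path = nbdt_predict(np.array([1.0, 0.0, 0.0]))
print(label, path)  # → cat ['animal', 'cat']
```

The traversal path (root → animal → cat) is what makes each prediction explainable: every intermediate decision can be inspected and named.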
Advisors: Joseph Gonzalez and Matthew Wright
BibTeX citation:
@mastersthesis{Lee:EECS-2020-71,
    Author = {Lee, Scott},
    Editor = {Gonzalez, Joseph and Wright, Matthew},
    Title = {Interpretable Few-Shot Image Classification with Neural-Backed Decision Trees},
    School = {EECS Department, University of California, Berkeley},
    Year = {2020},
    Month = {May},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-71.html},
    Number = {UCB/EECS-2020-71},
    Abstract = {Recent advances in deep learning techniques, especially convolutional neural networks (CNNs), have caused an explosion in their popularity for image-based tasks. Although CNNs achieve state-of-the-art accuracy and can be applied to a variety of tasks, they are difficult to interpret, as humans often cannot explain the reasoning behind the decisions made by these ``black box'' models. We propose a novel method, Neural-Backed Decision Trees (NBDTs), that combines the explainable nature of decision trees with the high predictive power of neural networks, and demonstrate competitive accuracy on standard image classification tasks (CIFAR10, CIFAR100, TinyImageNet). We quantitatively and qualitatively analyze the quality of explanations for the decisions made by our method, finding that related classes are clustered with higher similarity compared to lexical methods. Additionally, we extend NBDTs to few-shot image classification on the Animals with Attributes 2 dataset, maintaining high accuracy on seen classes and achieving competitive accuracy for few-shot classes.},
}
EndNote citation:
%0 Thesis
%A Lee, Scott
%E Gonzalez, Joseph
%E Wright, Matthew
%T Interpretable Few-Shot Image Classification with Neural-Backed Decision Trees
%I EECS Department, University of California, Berkeley
%D 2020
%8 May 27
%@ UCB/EECS-2020-71
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-71.html
%F Lee:EECS-2020-71