Exploratory and Explanatory Tools for ML Application Development

Eldon Schoop

EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2022-234
November 15, 2022

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-234.pdf

While Deep Learning ("DL") techniques have enabled groundbreaking advances in many domains, non-expert DL users encounter significant usability challenges when attempting to develop, debug, and interpret DL applications. This work draws on techniques from program analysis and DL interpretability to build novel, interactive tools that support users at important stages of DL development. Key interactions of these tools facilitate pattern discovery through exploration and provide explanations that reveal underlying structure. At early stages, Acumen helps users find suitable templates to start their DL projects by letting them explore and annotate an interactive visualization of code embeddings and extracted attributes. During model training, Umlaut helps users find and fix silent errors in DL programs with an interface that unifies visualizations, code, and error explanations. During model evaluation, IMACS helps users explore and compare influential concepts extracted from image classification models. User studies reveal how these systems address usability gaps at different stages of the DL development process, and how their interaction techniques can generalize to other scenarios.

Advisor: Björn Hartmann


BibTeX citation:

@phdthesis{Schoop:EECS-2022-234,
    Author = {Schoop, Eldon},
    Title = {Exploratory and Explanatory Tools for ML Application Development},
    School = {EECS Department, University of California, Berkeley},
    Year = {2022},
    Month = {Nov},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-234.html},
    Number = {UCB/EECS-2022-234},
    Abstract = {While Deep Learning ("DL") techniques have enabled groundbreaking advances in many domains, non-expert DL users encounter significant usability challenges when attempting to develop, debug, and interpret DL applications. This work draws on techniques from program analysis and DL interpretability to build novel, interactive tools that support users at important stages of DL development. Key interactions of these tools facilitate pattern discovery through exploration and provide explanations that reveal underlying structure. At early stages, Acumen helps users find suitable templates to start their DL projects by letting them explore and annotate an interactive visualization of code embeddings and extracted attributes. During model training, Umlaut helps users find and fix silent errors in DL programs with an interface that unifies visualizations, code, and error explanations. During model evaluation, IMACS helps users explore and compare influential concepts extracted from image classification models. User studies reveal how these systems address usability gaps at different stages of the DL development process, and how their interaction techniques can generalize to other scenarios.}
}

EndNote citation:

%0 Thesis
%A Schoop, Eldon
%T Exploratory and Explanatory Tools for ML Application Development
%I EECS Department, University of California, Berkeley
%D 2022
%8 November 15
%@ UCB/EECS-2022-234
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-234.html
%F Schoop:EECS-2022-234