Daniel Li

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2018-33

May 8, 2018

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-33.pdf

We present Adaptive Memory Networks (AMN), which process input-question pairs to dynamically construct a network architecture optimized for lower inference times. AMN creates multiple memory banks to store entities from the input story for answering questions. Based on the question, the model learns to identify the important entities in the input text and concentrates them within a single memory bank. At inference, one or a few banks are used, creating a tradeoff between accuracy and performance. AMN is enabled by, first, a novel bank controller that makes discrete decisions with high accuracy and, second, the capabilities of dynamic frameworks (such as PyTorch) that allow dynamic network sizing and efficient variable mini-batching. Our results demonstrate that the model learns to construct a varying number of memory banks based on task complexity and achieves faster inference times on both standard and modified bAbI tasks. We solve all bAbI tasks, using an average of 48% fewer entities on tasks containing excess, unrelated information.
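To make the abstract's core idea concrete, here is a toy sketch, in plain Python, of a bank controller that makes a discrete per-entity decision, concentrating question-relevant entities into a single memory bank so inference can consult fewer banks. The function name, the two-bank layout, and the threshold rule are illustrative assumptions, not the report's actual (learned) architecture:

```python
# Hypothetical sketch of the bank-controller idea from the abstract.
# In AMN the relevance of each entity to the question is learned; here
# we take relevance scores as given and make a hard (discrete) decision.

def assign_banks(entities, relevance, threshold=0.5):
    """Split entities into memory banks by a discrete relevance decision.

    entities  -- list of entity names from the input story
    relevance -- dict mapping entity -> relevance score in [0, 1]
                 (in AMN this would be derived from the question)
    Returns a list of banks; bank 0 concentrates the relevant entities.
    """
    banks = [[], []]  # bank 0: question-relevant, bank 1: overflow
    for e in entities:
        bank_id = 0 if relevance[e] >= threshold else 1  # discrete decision
        banks[bank_id].append(e)
    return banks

story_entities = ["john", "kitchen", "apple", "weather", "football"]
scores = {"john": 0.9, "kitchen": 0.8, "apple": 0.7,
          "weather": 0.1, "football": 0.2}

banks = assign_banks(story_entities, scores)
# Answering then reads only bank 0 -- the accuracy/performance tradeoff
# the abstract describes, since fewer entities are processed at inference.
print(banks[0])  # → ['john', 'kitchen', 'apple']
```

On tasks with excess, unrelated information, most entities would land in the overflow bank, which is the regime where the report's 48% reduction in processed entities comes from.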

Advisor: Satish Rao


BibTeX citation:

@mastersthesis{Li:EECS-2018-33,
    Author= {Li, Daniel},
    Title= {Adaptive Memory Networks},
    School= {EECS Department, University of California, Berkeley},
    Year= {2018},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-33.html},
    Number= {UCB/EECS-2018-33},
    Abstract= {We present Adaptive Memory Networks (AMN), which process input-question pairs to dynamically construct a network architecture optimized for lower inference times. AMN creates multiple memory banks to store entities from the input story for answering questions. Based on the question, the model learns to identify the important entities in the input text and concentrates them within a single memory bank. At inference, one or a few banks are used, creating a tradeoff between accuracy and performance. AMN is enabled by, first, a novel bank controller that makes discrete decisions with high accuracy and, second, the capabilities of dynamic frameworks (such as PyTorch) that allow dynamic network sizing and efficient variable mini-batching. Our results demonstrate that the model learns to construct a varying number of memory banks based on task complexity and achieves faster inference times on both standard and modified bAbI tasks. We solve all bAbI tasks, using an average of 48% fewer entities on tasks containing excess, unrelated information.},
}

EndNote citation:

%0 Thesis
%A Li, Daniel 
%T Adaptive Memory Networks
%I EECS Department, University of California, Berkeley
%D 2018
%8 May 8
%@ UCB/EECS-2018-33
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-33.html
%F Li:EECS-2018-33