Narrative Generation Using Learned Event Representations

Brenton Chu

EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2019-90
May 21, 2019

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-90.pdf

In this paper, I propose a model for writing stories that uses learned event representations to guide the construction of future events and, subsequently, the sentences associated with those events. Event representations composed of tuples that take specific elements from a dependency parse lose information when translating between sentence and event representation; allowing the model to learn its own event representations, guided by the existing tuple representations, retains information relevant to producing subsequent sentences. The model beats the baseline results and the models from Martin et al. on perplexity for sentence generation, as well as on most of the top-5 accuracy scores. In human evaluation, my model produces significantly better output than Martin et al.'s model, with marginal improvement over a seq2seq model.
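To make the contrast above concrete, the tuple-based representation can be sketched as follows. This is a minimal illustration, assuming spaCy for the dependency parse; the (subject, verb, object, modifier) tuple format follows Martin et al., but the extraction rules here are my own simplification, not the report's exact pipeline.

# Minimal sketch: extracting an event tuple (subject, verb, object, modifier)
# from a sentence via a dependency parse, in the style of Martin et al.
# Assumes spaCy with the en_core_web_sm model installed.
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_event(sentence):
    """Return a (subject, verb, object, modifier) tuple, with None for
    any element the parse does not supply."""
    doc = nlp(sentence)
    subj = verb = obj = mod = None
    for token in doc:
        # Take the main verb of the sentence as the event's verb.
        if token.dep_ == "ROOT" and token.pos_ == "VERB":
            verb = token.lemma_
            for child in token.children:
                if child.dep_ in ("nsubj", "nsubjpass"):
                    subj = child.lemma_
                elif child.dep_ in ("dobj", "obj"):
                    obj = child.lemma_
                elif child.dep_ == "prep":
                    mod = child.lemma_
    return (subj, verb, obj, mod)

print(extract_event("The knight rode his horse into the forest."))
# e.g., ('knight', 'ride', 'horse', 'into')

Note how much of the sentence (tense, determiners, the object of the prepositional phrase) is discarded by the fixed tuple; this is the information loss that the learned event representations are meant to avoid.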

Advisor: Marti Hearst


BibTeX citation:

@mastersthesis{Chu:EECS-2019-90,
    Author = {Chu, Brenton},
    Title = {Narrative Generation Using Learned Event Representations},
    School = {EECS Department, University of California, Berkeley},
    Year = {2019},
    Month = {May},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-90.html},
    Number = {UCB/EECS-2019-90},
    Abstract = {In this paper, I propose a model for writing stories that uses learned event representations to guide the construction of future events and, subsequently, the sentences associated with those events. Event representations composed of tuples that take specific elements from a dependency parse lose information when translating between sentence and event representation; allowing the model to learn its own event representations, guided by the existing tuple representations, retains information relevant to producing subsequent sentences. The model beats the baseline results and the models from Martin et al. on perplexity for sentence generation, as well as on most of the top-5 accuracy scores. In human evaluation, my model produces significantly better output than Martin et al.'s model, with marginal improvement over a seq2seq model.}
}

EndNote citation:

%0 Thesis
%A Chu, Brenton
%T Narrative Generation Using Learned Event Representations
%I EECS Department, University of California, Berkeley
%D 2019
%8 May 21
%@ UCB/EECS-2019-90
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-90.html
%F Chu:EECS-2019-90