Constructing Taxonomies from Pretrained Language Models
Catherine Chen and Kevin Lin and Daniel Klein
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2024-141
May 31, 2024
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-141.pdf
We present a method for constructing taxonomic trees (e.g., WordNet) using pretrained language models. Our approach comprises two modules: one that predicts parenthood relations and another that reconciles those predictions into trees. The parenthood prediction module produces a likelihood score for each potential parent-child pair, yielding a graph of parent-child relation scores. The tree reconciliation module treats the task as a graph optimization problem and outputs the maximum spanning tree of this graph. We train our model on subtrees sampled from WordNet and test on non-overlapping WordNet subtrees. We show that incorporating web-retrieved glosses can further improve performance. On the task of constructing subtrees of English WordNet, the model achieves 66.7 ancestor F1, a 20.0% relative increase over the previous best published result on this task. In addition, we convert the original English dataset into nine other languages using Open Multilingual WordNet and extend our results across these languages.
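The pipeline described above (score every candidate parent-child pair, then extract a maximum spanning tree over the resulting directed score graph) can be illustrated end to end. The sketch below is illustrative only: score_parenthood is a hypothetical stand-in for the report's LM-based parenthood module, and NetworkX's Chu-Liu/Edmonds maximum spanning arborescence is one standard way to realize the maximum-spanning-tree step for a directed graph; the report's actual implementation may differ.

import itertools
import networkx as nx

def score_parenthood(parent: str, child: str) -> float:
    """Hypothetical stand-in for the LM-based parenthood module.

    In the report, a pretrained language model assigns a likelihood
    score to each (parent, child) pair; here a tiny hand-written table
    fakes those scores so the example runs end to end.
    """
    toy_scores = {
        ("animal", "dog"): 0.9, ("animal", "cat"): 0.8,
        ("animal", "poodle"): 0.2, ("dog", "poodle"): 0.95,
        ("cat", "poodle"): 0.1, ("dog", "cat"): 0.3,
    }
    return toy_scores.get((parent, child), 0.05)

def reconcile(terms: list[str]) -> nx.DiGraph:
    """Build the complete directed parenthood-score graph over the
    terms and return its maximum spanning arborescence (a rooted tree)."""
    graph = nx.DiGraph()
    for parent, child in itertools.permutations(terms, 2):
        graph.add_edge(parent, child, weight=score_parenthood(parent, child))
    return nx.maximum_spanning_arborescence(graph, attr="weight")

tree = reconcile(["animal", "dog", "cat", "poodle"])
print(sorted(tree.edges()))  # [('animal', 'cat'), ('animal', 'dog'), ('dog', 'poodle')]

An arborescence, rather than an undirected spanning tree, fits this setting because parenthood edges are directed: every node except the root receives exactly one parent.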
Advisors: Daniel Klein and Jack Gallant
BibTeX citation:
@mastersthesis{Chen:EECS-2024-141,
    Author = {Chen, Catherine and Lin, Kevin and Klein, Daniel},
    Title = {Constructing Taxonomies from Pretrained Language Models},
    School = {EECS Department, University of California, Berkeley},
    Year = {2024},
    Month = {May},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-141.html},
    Number = {UCB/EECS-2024-141},
    Abstract = {We present a method for constructing taxonomic trees (e.g., WordNet) using pretrained language models. Our approach comprises two modules: one that predicts parenthood relations and another that reconciles those predictions into trees. The parenthood prediction module produces a likelihood score for each potential parent-child pair, yielding a graph of parent-child relation scores. The tree reconciliation module treats the task as a graph optimization problem and outputs the maximum spanning tree of this graph. We train our model on subtrees sampled from WordNet and test on non-overlapping WordNet subtrees. We show that incorporating web-retrieved glosses can further improve performance. On the task of constructing subtrees of English WordNet, the model achieves 66.7 ancestor F1, a 20.0% relative increase over the previous best published result on this task. In addition, we convert the original English dataset into nine other languages using Open Multilingual WordNet and extend our results across these languages.},
}
EndNote citation:
%0 Thesis
%A Chen, Catherine
%A Lin, Kevin
%A Klein, Daniel
%T Constructing Taxonomies from Pretrained Language Models
%I EECS Department, University of California, Berkeley
%D 2024
%8 May 31
%@ UCB/EECS-2024-141
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-141.html
%F Chen:EECS-2024-141