Kevin Lin
EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2024-218
December 18, 2024
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-218.pdf
Large language models (LLMs) are as capable as the context that they are given. This dissertation studies how to structure context, improving LLMs by orchestrating and engineering the contexts that they are given. First, we present context decomposition, a technique for breaking complex contexts into simpler contexts that specialized models are more capable of handling. Second, we show in a comparative study that context rewriting, a method for re-representing conversational utterances into simpler contexts, improves data labeling efficiency and modularity. Finally, we present context tuning, a technique to finetune LLMs to better handle input contexts.
Advisors: Daniel Klein and Joseph Gonzalez
"; ?>
BibTeX citation:
@phdthesis{Lin:EECS-2024-218,
    Author = {Lin, Kevin},
    Title = {Structured Contexts For Large Language Models},
    School = {EECS Department, University of California, Berkeley},
    Year = {2024},
    Month = {Dec},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-218.html},
    Number = {UCB/EECS-2024-218},
    Abstract = {Large language models (LLMs) are as capable as the context that they are given. This dissertation studies how to structure context, improving LLMs by orchestrating and engineering the contexts that they are given. First, we present context decomposition, a technique for breaking complex contexts into simpler contexts that specialized models are more capable of handling. Second, we show in a comparative study that context rewriting, a method for re-representing conversational utterances into simpler contexts, improves data labeling efficiency and modularity. Finally, we present context tuning, a technique to finetune LLMs to better handle input contexts.}
}
EndNote citation:
%0 Thesis
%A Lin, Kevin
%T Structured Contexts For Large Language Models
%I EECS Department, University of California, Berkeley
%D 2024
%8 December 18
%@ UCB/EECS-2024-218
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-218.html
%F Lin:EECS-2024-218