Multiplicative Coding and Factorization in Vector Symbolic Models of Cognition
Spencer Kent
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2020-215
December 18, 2020
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-215.pdf
This dissertation covers my attempts to confront the challenge and promise of multiplicative representations, and their attendant factorization problems, in the brain. This is grounded in a paradigm for modeling cognition that defines an algebra over high-dimensional vectors and presents a compelling factorization problem. The proposed solution to this problem, a recurrent neural network architecture called Resonator Networks, has several interesting properties that make it uniquely effective on this problem and may provide some principles for designing a new class of neural network models. I show some applications of multiplicative distributed codes for representing visual scenes and suggest how such representations may be a useful tool for unifying symbolic and connectionist theories of intelligence.
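To make the factorization problem concrete, the following is a minimal sketch of the kind of Resonator Network iteration described in the associated publications: bipolar code vectors are bound by elementwise (Hadamard) multiplication, and each factor estimate is refined by unbinding the current estimates of the other factors and projecting through its codebook. The dimensions N and M, the iteration count, and the sgn helper are illustrative assumptions, not values taken from the dissertation.

import numpy as np

rng = np.random.default_rng(0)
N, M, F = 1500, 20, 3   # vector dimension, codebook size, number of factors (illustrative)

def sgn(v):
    # Bipolarize: nonnegative entries -> +1, negative -> -1 (avoids zeros from np.sign)
    return np.where(v >= 0, 1.0, -1.0)

# One random bipolar codebook per factor; columns are candidate code vectors.
codebooks = [sgn(rng.standard_normal((N, M))) for _ in range(F)]

# Compose a target by binding (Hadamard product) one vector drawn from each codebook.
true_idx = [int(rng.integers(M)) for _ in range(F)]
s = np.prod([codebooks[f][:, true_idx[f]] for f in range(F)], axis=0)

# Resonator dynamics: initialize each estimate with the superposition of its codebook,
# then repeatedly unbind the other factors from s and project onto the codebook.
x_hat = [sgn(cb.sum(axis=1)) for cb in codebooks]
for _ in range(50):
    for f in range(F):
        others = np.prod([x_hat[g] for g in range(F) if g != f], axis=0)
        x_hat[f] = sgn(codebooks[f] @ (codebooks[f].T @ (s * others)))

# Decode each factor by nearest codebook vector; with these sizes the true indices are
# typically recovered.
decoded = [int(np.argmax(codebooks[f].T @ x_hat[f])) for f in range(F)]
print("decoded:", decoded, "true:", true_idx)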
Advisors: Bruno Olshausen and Alexei (Alyosha) Efros
BibTeX citation:
@phdthesis{Kent:EECS-2020-215,
    Author = {Kent, Spencer},
    Title = {Multiplicative Coding and Factorization in Vector Symbolic Models of Cognition},
    School = {EECS Department, University of California, Berkeley},
    Year = {2020},
    Month = {Dec},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-215.html},
    Number = {UCB/EECS-2020-215},
    Abstract = {This dissertation covers my attempts to confront the challenge and promise of multiplicative representations, and their attendant factorization problems, in the brain. This is grounded in a paradigm for modeling cognition that defines an algebra over high-dimensional vectors and presents a compelling factorization problem. The proposed solution to this problem, a recurrent neural network architecture called Resonator Networks, has several interesting properties that make it uniquely effective on this problem and may provide some principles for designing a new class of neural network models. I show some applications of multiplicative distributed codes for representing visual scenes and suggest how such representations may be a useful tool for unifying symbolic and connectionist theories of intelligence.},
}
EndNote citation:
%0 Thesis
%A Kent, Spencer
%T Multiplicative Coding and Factorization in Vector Symbolic Models of Cognition
%I EECS Department, University of California, Berkeley
%D 2020
%8 December 18
%@ UCB/EECS-2020-215
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-215.html
%F Kent:EECS-2020-215