Daniel Sun

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2022-118

May 13, 2022

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-118.pdf

Low-power machine learning on hardware is increasingly vital for applications ranging from wearables such as smartwatches to devices on the edge. Hyperdimensional Computing (HDC) is one proposed algorithm that has consistently demonstrated high accuracy with minimal power consumption across diverse classification tasks such as gesture recognition [20] and speech recognition [12]. HDC processors have been implemented in the past; however, their energy efficiency has largely been limited by costly hypervector memory storage, which grows linearly with the number of input features or sensors. In this work, we propose a novel method of combining HDC sensor fusion with cellular automaton (CA) rule 90 and vector folding to reduce the processor's memory requirement to nearly zero, reaching an energy efficiency of 39.1 nJ/prediction: a 4.9x energy efficiency improvement (9.5x per channel) over the state-of-the-art HDC processor. The processor is also compared with an analogous SVM implementation, demonstrating a 9.5x energy efficiency improvement over SVM when scaling to 2^14 channels. The energy efficiency and scalability of HDC are a testament to its applicability to a broad range of low-power machine learning tasks today. As such, exploration of HDC on other classification tasks such as TinyML Keyword Spotting is also performed, as part of an ongoing effort to pave the way for HDC to become the paradigm of choice for high-accuracy, low-power, and real-time classification tasks.
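The key memory-saving idea above is that per-channel hypervectors need not be stored at all: CA rule 90 (each cell becomes the XOR of its two neighbors) can regenerate them on the fly from a single stored seed. The following minimal Python sketch illustrates that idea; the dimension, seed, and function names are illustrative assumptions, not the thesis's actual hardware design.

```python
import numpy as np

D = 64  # hypervector dimension; real HDC designs use thousands of bits

def rule90_step(state: np.ndarray) -> np.ndarray:
    """One CA rule 90 update with wrap-around boundaries:
    each cell becomes (left neighbor XOR right neighbor).
    In hardware this is a single layer of XOR gates."""
    return np.roll(state, 1) ^ np.roll(state, -1)

# Only this one seed vector is stored in memory.
rng = np.random.default_rng(0)
seed = rng.integers(0, 2, size=D, dtype=np.uint8)

def channel_hypervector(k: int) -> np.ndarray:
    """Derive the hypervector for input channel k by iterating rule 90
    k times from the seed, instead of storing one vector per channel."""
    hv = seed.copy()
    for _ in range(k):
        hv = rule90_step(hv)
    return hv

hv3 = channel_hypervector(3)  # hypervector for channel 3, computed on demand
```

Because the per-channel vectors are deterministic functions of the seed, storage no longer grows linearly with the channel count, which is the scaling bottleneck the abstract identifies.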

Advisor: Jan M. Rabaey


BibTeX citation:

@mastersthesis{Sun:EECS-2022-118,
    Author= {Sun, Daniel},
    Title= {Low-Power Hyperdimensional Computing Processors for Real-Time Wearable Sensor Fusion and Keyword Classification},
    School= {EECS Department, University of California, Berkeley},
    Year= {2022},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-118.html},
    Number= {UCB/EECS-2022-118},
    Abstract= {Low-power machine learning on hardware is increasingly vital for applications ranging from wearables such as smartwatches to devices on the edge. Hyperdimensional Computing (HDC) is one proposed algorithm that has consistently demonstrated high accuracy with minimal power consumption across diverse classification tasks such as gesture recognition [20] and speech recognition [12]. HDC processors have been implemented in the past; however, their energy efficiency has largely been limited by costly hypervector memory storage, which grows linearly with the number of input features or sensors. In this work, we propose a novel method of combining HDC sensor fusion with cellular automaton (CA) rule 90 and vector folding to reduce the processor's memory requirement to nearly zero, reaching an energy efficiency of 39.1 nJ/prediction: a 4.9x energy efficiency improvement (9.5x per channel) over the state-of-the-art HDC processor. The processor is also compared with an analogous SVM implementation, demonstrating a 9.5x energy efficiency improvement over SVM when scaling to 2^14 channels. The energy efficiency and scalability of HDC are a testament to its applicability to a broad range of low-power machine learning tasks today. As such, exploration of HDC on other classification tasks such as TinyML Keyword Spotting is also performed, as part of an ongoing effort to pave the way for HDC to become the paradigm of choice for high-accuracy, low-power, and real-time classification tasks.},
}

EndNote citation:

%0 Thesis
%A Sun, Daniel 
%T Low-Power Hyperdimensional Computing Processors for Real-Time Wearable Sensor Fusion and Keyword Classification
%I EECS Department, University of California, Berkeley
%D 2022
%8 May 13
%@ UCB/EECS-2022-118
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-118.html
%F Sun:EECS-2022-118