Alisha Menon

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2023-244

December 1, 2023

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-244.pdf

With the explosive growth of wearable devices across medical applications, monitoring a broad range of biosignals becomes increasingly viable. However, these sensors are constrained in hardware resources and battery life. Low-power in-sensor intelligence can replace the costly transmission of raw data streams, improving battery life. For a neural prosthetic, for example, various biosensors can be used to intelligently map the user's intended movements into prosthetic actuation. Performing this computation locally extends the lifetime of the device and reduces latency, significantly improving the user experience. This thesis addresses this need with the emerging brain-inspired hyperdimensional computing (HDC) paradigm, which uses an inherently simple binary representation.
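
To make the binary representation concrete, below is a minimal sketch of the core HDC primitives (binding with XOR, bundling by majority vote, and Hamming similarity); the dimensionality, operator choices, and names are illustrative assumptions, not the thesis implementation.

import numpy as np

D = 10000  # hypervector dimensionality (a typical HDC choice, assumed here)

def random_hv(rng):
    """Draw a dense random binary hypervector."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Bind two hypervectors with elementwise XOR; associates a value with a role."""
    return np.bitwise_xor(a, b)

def bundle(hvs):
    """Bundle a list of hypervectors by elementwise majority vote."""
    return (np.sum(hvs, axis=0) > len(hvs) / 2).astype(np.uint8)

def hamming_similarity(a, b):
    """Normalized similarity: 1.0 = identical, ~0.5 = unrelated."""
    return 1.0 - np.count_nonzero(a != b) / D

rng = np.random.default_rng(0)
x, y, z = random_hv(rng), random_hv(rng), random_hv(rng)
print(hamming_similarity(x, bundle([x, y, z])))          # > 0.5: x remains recoverable from the bundle
print(hamming_similarity(bind(bind(x, y), y), x))        # 1.0: XOR binding is invertible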

The first section of this thesis explores the energy efficiency of HDC for machine learning, comparing it against traditional ML algorithms through the design and post-layout simulation of biosignal classification ASICs. With on-the-fly vector generation instead of memory storage, combined with vector folding, the proposed architecture achieves 39.1 nJ/prediction: a 4.9x improvement over the state-of-the-art HDC processor and 9.5x over an optimized SVM processor, paving the way for HDC to become the paradigm of choice for in-sensor classification.
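
As a rough, hedged illustration of the two hardware ideas named above, the sketch below regenerates item hypervectors deterministically from a seed instead of reading them out of an item memory ("on-the-fly generation"), and accumulates similarity over narrow slices of the vector the way a folded datapath would ("vector folding"); the dimensions, fold width, and helper names are assumptions rather than the ASIC's actual microarchitecture.

import numpy as np

D, FOLD = 2048, 64   # vector length and datapath width per step (assumed values)

def item_hv(channel_id):
    """Regenerate a channel's hypervector from its ID on demand (no item memory)."""
    rng = np.random.default_rng(channel_id)   # software stand-in for an on-chip pseudorandom generator
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def fold_hamming(a, b):
    """Accumulate Hamming distance FOLD bits at a time, as a folded datapath would."""
    dist = 0
    for i in range(0, D, FOLD):
        dist += int(np.count_nonzero(a[i:i + FOLD] ^ b[i:i + FOLD]))
    return dist

query = item_hv(3) ^ item_hv(7)       # toy encoded sample
prototype = item_hv(3) ^ item_hv(7)   # regenerated, not stored
print(fold_hamming(query, prototype))  # 0: the regenerated vectors match exactly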

The second section of this thesis explores the use of the paradigm for robotics, including the development of a novel reactive robotics algorithm with a weighted heterogeneous sensor encoding scheme that intelligently prioritizes successful behaviors, boosting the success rate in a 2-D navigation task by over 30%, even when integrated into a neural network.
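
One hypothetical way to read the weighted heterogeneous sensor encoding is sketched below: readings from different sensors are bound to per-sensor ID vectors and bundled into a single state hypervector, and per-behavior prototypes are accumulated with a larger weight when a trial succeeds, so inference favors behaviors that have worked before. The class, weights, and sensor names are illustrative assumptions, not the thesis algorithm.

import zlib
import numpy as np

D = 4096

def _hv(seed):
    return np.random.default_rng(seed).integers(0, 2, size=D, dtype=np.uint8)

def encode_state(readings):
    """Bind each sensor's ID vector with its quantized-level vector, then bundle by majority."""
    pairs = [_hv(zlib.crc32(name.encode())) ^ _hv(level) for name, level in readings.items()]
    return (np.sum(pairs, axis=0) * 2 > len(pairs)).astype(np.uint8)

class WeightedBehaviorMemory:
    def __init__(self, behaviors, success_weight=3):
        self.acc = {b: np.zeros(D, dtype=np.int32) for b in behaviors}
        self.w = success_weight

    def record(self, readings, behavior, success):
        """Add the encoded state to the behavior's prototype, weighted more if it succeeded."""
        bipolar = encode_state(readings).astype(np.int32) * 2 - 1   # {0,1} -> {-1,+1}
        self.acc[behavior] += (self.w if success else 1) * bipolar

    def choose(self, readings):
        """Pick the behavior whose prototype is most similar to the current sensor state."""
        state = encode_state(readings).astype(np.int32) * 2 - 1
        return max(self.acc, key=lambda b: int(self.acc[b] @ state))

mem = WeightedBehaviorMemory(["turn_left", "turn_right", "go_forward"])
mem.record({"sonar": 2, "bump": 0}, "turn_left", success=True)
print(mem.choose({"sonar": 2, "bump": 0}))   # -> "turn_left"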

The final section of this thesis brings the prior elements together to realize a user-adaptive neural prosthetic with shared control. The controller recognizes the user's behaviors, predicts their next action based on habitual sequences, and determines prosthetic actuation through intelligent deliberation between the user's goal and sensor feedback-driven autonomy. With each layer designed for hardware efficiency to enable in-sensor implementation, the system achieves an overall accuracy of 93%.
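
The sketch below illustrates the layered shared-control idea described above in its simplest possible form: learning habitual action sequences, predicting the next action, and deliberating between the decoded user intent and sensor-driven autonomy. The layer names, the confidence threshold, and the arbitration rule are assumptions for illustration only, not the thesis controller.

from collections import Counter, defaultdict

class SharedController:
    def __init__(self, conflict_threshold=0.5):
        self.history = defaultdict(Counter)   # habitual sequences: previous action -> next-action counts
        self.threshold = conflict_threshold

    def observe(self, prev_action, next_action):
        """Learn the user's habitual action sequences."""
        self.history[prev_action][next_action] += 1

    def predict_next(self, prev_action):
        """Predict the most likely next action from the learned sequences."""
        counts = self.history.get(prev_action)
        return counts.most_common(1)[0][0] if counts else None

    def actuate(self, user_intent, autonomy_action, autonomy_confidence):
        """Deliberate between the decoded user goal and sensor feedback-driven autonomy."""
        if user_intent == autonomy_action:
            return user_intent
        # On disagreement, defer to autonomy only when its sensor evidence is strong.
        return autonomy_action if autonomy_confidence > self.threshold else user_intent

ctrl = SharedController()
ctrl.observe("reach", "grasp")
print(ctrl.predict_next("reach"))               # -> "grasp"
print(ctrl.actuate("grasp", "open_hand", 0.3))  # -> "grasp" (user wins on weak autonomy evidence)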

Advisor: Jan M. Rabaey


BibTeX citation:

@phdthesis{Menon:EECS-2023-244,
    Author= {Menon, Alisha},
    Title= {Neural Prosthetic with In-sensor Shared Control},
    School= {EECS Department, University of California, Berkeley},
    Year= {2023},
    Month= {Dec},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-244.html},
    Number= {UCB/EECS-2023-244},
    Abstract= {With the explosive growth of wearable devices across a wide range of medical applications, the ability to monitor a broad range of biosignals becomes increasingly viable. However, these sensors face limitations in hardware resources and battery life. Low-power in-sensor intelligence can replace costly transmission of raw data streams to improve battery life. For a neural prosthetic, for example, various biosensors can be used to intelligently map the user's intended movements into prosthetic actuation. Achieving this locally can extend the lifetime of the device and reduce latency, significantly improving the user experience. This thesis leverages the emerging brain-inspired hyperdimensional computing (HDC) paradigm, which uses an inherently simple binary representation, to address this need.

The first section of this thesis explores the energy efficiency of HDC for machine learning, including a comparison against traditional ML algorithms, through design and post-layout simulation of biosignal classification ASICs. With on-the-fly generation instead of memory storage, and vector folding, the proposed architecture achieves 39.1 nJ/prediction; a 4.9x improvement over the state-of-the-art HDC processor and 9.5x over an optimized SVM processor, paving the way for it to become the paradigm of choice for in-sensor classification. 

The second section of this thesis explores the use of the paradigm for robotics including the development of a novel reactive robotics algorithm with a weighted heterogeneous sensor encoding scheme that intelligently prioritizes successful behaviors, boosting the success rate in a 2-D navigation task by over 30%, even when integrated into a neural network. 

The final section of this thesis pulls together the prior elements for the realization of a user-adaptive neural prosthetic with shared control. The controller recognizes the user’s behaviors, predicts their next action based on habitual sequences, and determines prosthetic actuation through intelligent deliberation between the user's goal and sensor feedback-driven autonomy. With each layer designed for hardware-efficiency to enable in-sensor implementation, the system achieves an overall accuracy of 93%.},
}

EndNote citation:

%0 Thesis
%A Menon, Alisha 
%T Neural Prosthetic with In-sensor Shared Control
%I EECS Department, University of California, Berkeley
%D 2023
%8 December 1
%@ UCB/EECS-2023-244
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-244.html
%F Menon:EECS-2023-244