Allen Yang, Roozbeh Jafari, Philip Kuryloski, Sameer Iyengar, S. Shankar Sastry and Ruzena Bajcsy

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2007-143

December 6, 2007

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-143.pdf

We propose a distributed recognition framework to classify human actions using a wearable motion sensor network. Each sensor node consists of an integrated triaxial accelerometer and a biaxial gyroscope. Given a set of pre-segmented actions as training examples, the algorithm simultaneously segments and classifies human actions from a motion sequence, and it also rejects unknown actions that are not in the training set. The classification is performed in a distributed fashion on the individual sensor nodes and a base station computer. Due to rapid advances in the integration of mobile processors and heterogeneous sensors, a distributed recognition system is likely to outperform traditional centralized recognition methods. In this paper, we assume the distribution of multiple action classes satisfies a mixture subspace model, one subspace for each action class. Given a new test sample, we seek the sparsest linear representation of the sample with respect to all training examples. We show that the dominant coefficients in the representation correspond only to the action class of the test sample, and hence its membership is encoded in the representation. We provide fast linear solvers to compute such a representation via L1 minimization.
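The classification step described in the abstract can be sketched briefly. The Python snippet below is a minimal illustration written for this note, not code from the report: it solves a lasso relaxation of the L1 problem with plain iterative soft-thresholding (ISTA), assigns the test sample to the class whose training columns give the smallest reconstruction residual, and rejects it as an unknown action when the sparse coefficients are not concentrated in a single class (a sparsity concentration index below an assumed threshold of 0.2). The report's own fast linear solvers and rejection rule may differ.

import numpy as np

def ista_l1(A, y, lam=0.01, n_iter=500):
    # Solve min_x 0.5*||A x - y||^2 + lam*||x||_1 by iterative soft-thresholding.
    L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the smooth term
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - A.T @ (A @ x - y) / L                # gradient step on the quadratic term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft-thresholding step
    return x

def classify_action(A, labels, y, reject_below=0.2):
    # A: d x n matrix whose columns are (unit-norm) training samples, grouped by class.
    # labels: length-n array of class ids for the columns of A (at least two classes).
    # y: length-d test sample. Returns a class id, or None for an unknown action.
    labels = np.asarray(labels)
    x = ista_l1(A, y)
    classes = np.unique(labels)
    k = len(classes)
    l1 = np.abs(x).sum() + 1e-12
    # Sparsity concentration index: 1 if all weight lies in one class, 0 if spread evenly.
    sci = (k * max(np.abs(x[labels == c]).sum() for c in classes) / l1 - 1) / (k - 1)
    if sci < reject_below:
        return None                                  # unknown action, not in the training set
    # Residual of reconstructing y from each class's coefficients alone.
    residuals = [np.linalg.norm(y - A @ np.where(labels == c, x, 0.0)) for c in classes]
    return classes[int(np.argmin(residuals))]

With unit-norm training columns, the residual comparison and the concentration-based rejection test mirror the idea in the abstract that class membership is encoded in which coefficients of the sparse representation are dominant.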


BibTeX citation:

@techreport{Yang:EECS-2007-143,
    Author= {Yang, Allen and Jafari, Roozbeh and Kuryloski, Philip and Iyengar, Sameer and Sastry, S. Shankar and Bajcsy, Ruzena},
    Title= {Distributed Segmentation and Classification of Human Actions Using a Wearable Motion Sensor Network},
    Year= {2007},
    Month= {Dec},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-143.html},
    Number= {UCB/EECS-2007-143},
    Abstract= {We propose a distributed recognition framework to classify human actions using a wearable motion sensor network. Each sensor node consists of an integrated triaxial accelerometer and a biaxial gyroscope. Given a set of pre-segmented actions as training examples, the algorithm simultaneously segments and classifies human actions from a motion sequence, and it also rejects unknown actions that are not in the training set. The classification is performed in a distributed fashion on the individual sensor nodes and a base station computer. Due to rapid advances in the integration of mobile processors and heterogeneous sensors, a distributed recognition system is likely to outperform traditional centralized recognition methods. In this paper, we assume the distribution of multiple action classes satisfies a mixture subspace model, one subspace for each action class. Given a new test sample, we seek the sparsest linear representation of the sample with respect to all training examples. We show that the dominant coefficients in the representation correspond only to the action class of the test sample, and hence its membership is encoded in the representation. We provide fast linear solvers to compute such a representation via L1 minimization.},
}

EndNote citation:

%0 Report
%A Yang, Allen 
%A Jafari, Roozbeh 
%A Kuryloski, Philip 
%A Iyengar, Sameer 
%A Sastry, S. Shankar 
%A Bajcsy, Ruzena 
%T Distributed Segmentation and Classification of Human Actions Using a Wearable Motion Sensor Network
%I EECS Department, University of California, Berkeley
%D 2007
%8 December 6
%@ UCB/EECS-2007-143
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-143.html
%F Yang:EECS-2007-143