Quick Motion Transitions with Cached Multi-way Blends

Leslie Kanani Michiko Ikemoto, Okan Arikan and David Forsyth

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2006-14

February 13, 2006

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-14.pdf

We describe a method for responsive, high-quality synthesis of human motion. Our method can quickly provide a motion synthesizer with a one-second-long, high-quality transition from any frame in a motion collection to any other frame in the collection.

We construct these transitions using 2-, 3- and 4-way blends. During pre-processing, we search all possible blends between representative samples of motion obtained using clustering. The blends are evaluated automatically with a novel motion evaluation procedure, which we demonstrate is significantly more accurate than current alternatives. The best blending recipe for each pair of representatives is then cached.
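As a rough illustration of this pre-processing stage, the Python sketch below clusters the collection into representatives, scores every candidate multi-way blend between each pair, and caches the best recipe. The helpers cluster_motions, enumerate_blends, and evaluate_blend (and the .id attribute) are hypothetical stand-ins for the clustering, blend-construction, and automatic motion-evaluation steps; they are not functions from the report.

    from itertools import product

    def build_blend_cache(motion_collection, num_representatives=64):
        """Cache the best 2-, 3- or 4-way blend recipe for each ordered
        pair of representative clips (assumed helper functions)."""
        representatives = cluster_motions(motion_collection, num_representatives)
        cache = {}
        for src, tgt in product(representatives, repeat=2):
            best_recipe, best_score = None, float("-inf")
            # Try every candidate 2-, 3- and 4-way blend between this pair.
            for recipe in enumerate_blends(src, tgt, max_ways=4):
                score = evaluate_blend(recipe)  # automatic motion evaluation
                if score > best_score:
                    best_recipe, best_score = recipe, score
            cache[(src.id, tgt.id)] = best_recipe
        return cache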

At run-time, we build a transition between motions by matching a future window of the source motion to a representative, matching the past of the target motion to a representative, and then applying the blend recipe recovered from the cache to the source and target motions and whatever stubs are required. This method yields good-looking transitions between distinct motions with very low online cost.
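A similarly hedged sketch of the run-time step follows, using the cache built above plus hypothetical match_to_representative and apply_recipe helpers (and assumed future_window/past_window methods) standing in for the nearest-representative lookup and recipe application described in the abstract.

    def make_transition(cache, source_motion, source_frame,
                        target_motion, target_frame):
        """Build a roughly one-second transition from a frame of the
        source motion to a frame of the target motion."""
        # Match the upcoming window of the source and the preceding
        # window of the target to their nearest representatives.
        src_rep = match_to_representative(source_motion.future_window(source_frame))
        tgt_rep = match_to_representative(target_motion.past_window(target_frame))
        # Look up the cached blend recipe for this pair of representatives.
        recipe = cache[(src_rep.id, tgt_rep.id)]
        # Apply the recipe to the actual source and target clips, pulling in
        # whatever extra stub clips the recipe calls for.
        return apply_recipe(recipe, source_motion, target_motion)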


BibTeX citation:

@techreport{Ikemoto:EECS-2006-14,
    Author= {Ikemoto, Leslie Kanani Michiko and Arikan, Okan and Forsyth, David},
    Title= {Quick Motion Transitions with Cached Multi-way Blends},
    Year= {2006},
    Month= {Feb},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-14.html},
    Number= {UCB/EECS-2006-14},
    Abstract= {We describe a method for responsive, high-quality synthesis of human motion.  Our method can quickly provide a motion synthesizer with a one-second-long, high-quality transition from any frame in a motion collection to any other frame in the collection.

We construct these transitions using 2-, 3- and 4-way blends.  During pre-processing, we search all possible blends between representative samples of motion obtained using clustering. The blends are evaluated automatically with a novel motion evaluation procedure, which we demonstrate is significantly more accurate than current alternatives. The best blending recipe for each pair of representatives is then cached.

At run-time, we build a transition between motions by matching a future window of the source motion to a representative, matching the past of the target motion to a representative, and then applying the blend recipe recovered from the cache to the source and target motions and whatever stubs are required. This method yields good-looking transitions between distinct motions with very low online cost.},
}

EndNote citation:

%0 Report
%A Ikemoto, Leslie Kanani Michiko 
%A Arikan, Okan 
%A Forsyth, David 
%T Quick Motion Transitions with Cached Multi-way Blends
%I EECS Department, University of California, Berkeley
%D 2006
%8 February 13
%@ UCB/EECS-2006-14
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-14.html
%F Ikemoto:EECS-2006-14