Cloth Capture

Ryan White, Anthony Lobay and D. A. Forsyth

EECS Department
University of California, Berkeley
Technical Report No. UCB/CSD-05-1387
May 2005

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2005/CSD-05-1387.pdf

We present a method for capturing the geometry and parameterization of fast-moving cloth using multiple video cameras, without requiring camera calibration. Our cloth is printed with a multiscale pattern that allows capture at both high speed and high spatial resolution, even though self-occlusion might block any individual camera from seeing the majority of the cloth. We show how to incorporate knowledge of this pattern into conventional structure from motion approaches, and use a novel scheme for camera calibration using the pattern, derived from the shape from texture literature. By combining strain minimization with the point reconstruction, we produce visually appealing cloth sequences. We demonstrate our algorithm by capturing, retexturing and displaying several sequences of fast-moving cloth.

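As a rough illustration of the last step in the abstract (combining strain minimization with the point reconstruction), the Python sketch below refines noisy triangulated grid points by penalizing deviation of neighbor distances from the rest spacing implied by the printed pattern, while keeping each point near its triangulated estimate. The regular-grid layout, the weights, and the use of SciPy's L-BFGS-B optimizer are illustrative assumptions, not the method described in the report.

import numpy as np
from scipy.optimize import minimize

def strain_refine(points3d, grid_shape, rest_len,
                  data_weight=1.0, strain_weight=10.0):
    """Refine noisy triangulated cloth points by penalizing edge-length strain.

    points3d   : (H*W, 3) triangulated positions on a regular grid
                 (hypothetical layout; the report derives the parameterization
                 from the printed pattern).
    grid_shape : (H, W) dimensions of the grid.
    rest_len   : rest spacing between neighboring grid points.
    """
    H, W = grid_shape
    idx = np.arange(H * W).reshape(H, W)
    # Horizontal and vertical neighbor pairs on the grid.
    edges = np.concatenate([
        np.stack([idx[:, :-1].ravel(), idx[:, 1:].ravel()], axis=1),
        np.stack([idx[:-1, :].ravel(), idx[1:, :].ravel()], axis=1),
    ])

    def energy(x):
        p = x.reshape(-1, 3)
        # Data term: stay near the triangulated positions.
        data = np.sum((p - points3d) ** 2)
        # Strain term: penalize stretch/compression of grid edges.
        d = np.linalg.norm(p[edges[:, 0]] - p[edges[:, 1]], axis=1)
        strain = np.sum((d - rest_len) ** 2)
        return data_weight * data + strain_weight * strain

    res = minimize(energy, points3d.ravel(), method="L-BFGS-B")
    return res.x.reshape(-1, 3)

In a capture pipeline along these lines, such a refinement would run per frame on the marker grid recovered from the multiscale pattern; the hypothetical strain_weight controls how strongly the result resists stretching away from the rest spacing.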

BibTeX citation:

@techreport{White:CSD-05-1387,
    Author = {White, Ryan and Lobay, Anthony and Forsyth, D. A.},
    Title = {Cloth Capture},
    Institution = {EECS Department, University of California, Berkeley},
    Year = {2005},
    Month = {May},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2005/5655.html},
    Number = {UCB/CSD-05-1387},
    Abstract = {We present a method for capturing the geometry and parameterization of fast-moving cloth using multiple video cameras, without requiring camera calibration. Our cloth is printed with a multiscale pattern that allows capture at both high speed and high spatial resolution, even though self-occlusion might block any individual camera from seeing the majority of the cloth. We show how to incorporate knowledge of this pattern into conventional structure from motion approaches, and use a novel scheme for camera calibration using the pattern, derived from the shape from texture literature. By combining strain minimization with the point reconstruction, we produce visually appealing cloth sequences. We demonstrate our algorithm by capturing, retexturing and displaying several sequences of fast-moving cloth.}
}

EndNote citation:

%0 Report
%A White, Ryan
%A Lobay, Anthony
%A Forsyth, D. A.
%T Cloth Capture
%I EECS Department, University of California, Berkeley
%D 2005
%@ UCB/CSD-05-1387
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2005/5655.html
%F White:CSD-05-1387