Modeling Cloth from Examples
Ryan White
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2007-134
November 8, 2007
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-134.pdf
We measure cloth properties and create cloth animations from video of actual cloth. This work spans several domains: texture tracking and replacement, 3D shape estimation from a single view, 3D shape estimation from multiple views, and data-driven cloth animation.
In the first part of the thesis, the emphasis is on single-view estimation of visual properties and the related problem of texture replacement. We show that "screen print" cloth has advantageous properties and that simple lighting estimation substantially improves both texture replacement and geometry estimation.
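The abstract does not spell out the lighting model, but the idea admits a short sketch. If the printed albedo of the screen-print pattern is known, per-pixel irradiance can be estimated by dividing the observed color by that albedo, and a replacement texture can be composited under the same irradiance so that shading and folds are preserved. The following Python/NumPy sketch illustrates that intrinsic-image-style decomposition; the function name, array shapes, and the assumption of a known albedo map are illustrative choices, not the thesis's implementation.

    import numpy as np

    def relight_texture(observed, albedo, new_texture, eps=1e-6):
        # observed, albedo, new_texture: float images in [0, 1], shape (H, W, 3).
        # Irradiance estimate: observed color divided by the known printed albedo.
        irradiance = observed / np.maximum(albedo, eps)
        # Composite the replacement texture under the recovered shading.
        return np.clip(new_texture * irradiance, 0.0, 1.0)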
In the second part of the thesis, we create cloth animations in a data-driven manner by recording the movement of real cloth. Our method involves printing custom clothing, capturing video of the clothing from multiple viewpoints, and then building models of the cloth motion as frame-by-frame geometry. This problem is difficult because of occlusion: one part of the cloth often blocks other parts from view. To overcome this challenge, we print a multi-colored pattern on the surface of the cloth to simplify identification of the surface, and we use a data-driven hole-filling technique to sew in observations of the missing regions from other frames. The resulting data can be used in several applications: we show examples of editing and modifying the data for different animation purposes.
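As a rough illustration of what a data-driven hole-filling step could look like, the sketch below borrows positions for unobserved vertices from another frame in which they are visible, after rigidly aligning donor and target on their shared observed vertices (an orthogonal Procrustes fit). This is a hedged approximation suggested by the abstract's description, not the algorithm from the thesis; all names and data conventions are assumptions.

    import numpy as np

    def procrustes(src, dst):
        # Least-squares rigid transform so that (src - c) @ R + d approximates dst.
        c, d = src.mean(axis=0), dst.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - c).T @ (dst - d))
        if np.linalg.det(U @ Vt) < 0:   # flip to avoid a reflection
            U[:, -1] *= -1
        return U @ Vt, c, d

    def fill_holes(frames, masks):
        # frames: (T, V, 3) vertex positions (undefined where unseen).
        # masks:  (T, V) booleans, True where the vertex was observed.
        T = frames.shape[0]
        filled = frames.copy()
        for t in range(T):
            hole = ~masks[t]
            if not hole.any():
                continue
            best, best_err = None, np.inf
            for s in range(T):
                shared = masks[t] & masks[s]
                # Donor frame must see every missing vertex and share enough
                # observed vertices to fit a rigid alignment.
                if s == t or not masks[s][hole].all() or shared.sum() < 3:
                    continue
                R, c, d = procrustes(frames[s][shared], frames[t][shared])
                resid = frames[t][shared] - ((frames[s][shared] - c) @ R + d)
                err = np.mean(np.sum(resid ** 2, axis=1))
                if err < best_err:
                    best_err, best = err, (s, R, c, d)
            if best is not None:        # sew in the donor's geometry
                s, R, c, d = best
                filled[t][hole] = (frames[s][hole] - c) @ R + d
        return filled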
Advisors: Jitendra Malik and David Forsyth
BibTeX citation:
@phdthesis{White:EECS-2007-134,
    Author = {White, Ryan},
    Title = {Modeling Cloth from Examples},
    School = {EECS Department, University of California, Berkeley},
    Year = {2007},
    Month = {Nov},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-134.html},
    Number = {UCB/EECS-2007-134},
    Abstract = {We measure cloth properties and create cloth animations from video of actual cloth. This work spans several domains: texture tracking and replacement, 3D shape estimation from a single view, 3D shape from multiple views and data driven cloth animation. In the first part of the thesis, the emphasis is on single view estimation of visual properties and the related problem of texture replacement. We show that \textit{screen print} cloth has advantageous properties and that simple lighting estimation makes texture replacement and geometry estimation substantially better. In the second part of the thesis, we create cloth animations in a data-driven manner by recording real cloth movement. Our method involves printing custom clothing, capturing video of the clothing from multiple viewpoints and then building models of the cloth motion as frame-by-frame geometry. This problem is difficult because of occlusion: often some portion of the cloth makes it impossible to see other regions of the cloth. To overcome this challenge, we print a multi-colored pattern on the surface of the cloth to simplify identification of the surface and we use a data-driven hole filling technique to sew in observations of missing data from other frames. The resulting data can be used in several applications: we show examples of editing and modifying the data for different animation purposes.}
}
EndNote citation:
%0 Thesis
%A White, Ryan
%T Modeling Cloth from Examples
%I EECS Department, University of California, Berkeley
%D 2007
%8 November 8
%@ UCB/EECS-2007-134
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-134.html
%F White:EECS-2007-134