Data Driven Cloth Animation



We present a new method for cloth animation based on data-driven synthesis. In contrast to approaches that focus on physical simulation, we animate cloth by manipulating short sequences of existing cloth animation. While our source of data is cloth animation captured using video cameras [White et al. 2007], the method is equally applicable to simulation data. The approach has benefits in both cases: current cloth capture is limited because small tweaks to the data require filming an entirely new sequence, while simulation suffers from long computation times and complications such as tangling. In this sketch we create new animations by fitting cloth animation to human motion capture data, i.e., we drive the cloth with a skeleton.
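As a rough illustration of the data-driven idea only (not the authors' algorithm), the Python sketch below matches short captured cloth clips to a new motion-capture stream by comparing their recorded skeletal poses, then concatenates the selected clips so the skeleton drives the cloth. ClothClip, pose_distance, and the per-joint distance metric are hypothetical assumptions for this sketch.

import numpy as np

class ClothClip:
    """A short captured cloth sequence plus the skeletal motion it was recorded under."""
    def __init__(self, skeleton_poses, cloth_frames):
        # skeleton_poses: (T, J, 3) joint positions per frame
        # cloth_frames:   (T, V, 3) cloth vertex positions per frame
        self.skeleton_poses = np.asarray(skeleton_poses)
        self.cloth_frames = np.asarray(cloth_frames)

def pose_distance(poses_a, poses_b):
    # Mean per-joint Euclidean distance over a segment (assumed metric).
    return np.mean(np.linalg.norm(poses_a - poses_b, axis=-1))

def best_clip_for_segment(mocap_segment, database):
    # Pick the clip whose recorded skeletal motion is closest to the query
    # segment (clips are assumed to have the same number of frames).
    distances = [pose_distance(mocap_segment, clip.skeleton_poses)
                 for clip in database]
    return database[int(np.argmin(distances))]

def synthesize(mocap, database, segment_len):
    # Drive the cloth with a skeleton: split the mocap stream into short
    # segments and concatenate the best-matching captured cloth clips.
    # (No blending at the seams; a real system would smooth transitions.)
    frames = []
    for start in range(0, len(mocap) - segment_len + 1, segment_len):
        clip = best_clip_for_segment(mocap[start:start + segment_len], database)
        frames.append(clip.cloth_frames[:segment_len])
    return np.concatenate(frames, axis=0)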

Ryan White, Keenan Crane, David Forsyth, "Data Driven Cloth Animation," SIGGRAPH Technical Sketch, 2007.


Videos

sketch video [17 MB DIVX AVI]
