Computer Graphics Laboratory ETH Zurich


PRECISION: Precomputing Environment Semantics for Contact-rich Character Animation

M. Kapadia, X. Xianghao, M. Nitti, M. Kallmann, S. Coros, R. W. Sumner, M. Gross

i3D '16 Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games (Redmond, WA, USA, February 27-28, 2016), pp. 29-37

Abstract

The widespread availability of high-quality motion capture data and the maturity of solutions for animating virtual characters have paved the way for the next generation of interactive virtual worlds, which exhibit intricate interactions between characters and the environments they inhabit. However, current motion synthesis techniques have not been designed to scale with complex environments and contact-rich motions, requiring environment designers to manually embed motion semantics in the environment geometry in order to support on-line motion synthesis. This paper presents an automated approach for analyzing both motions and environments in order to represent the different ways in which an environment can afford a character to move. We extract the salient features that characterize the contact-rich motion repertoire of a character and detect valid transitions in the environment where each of these motions may be possible, along with additional semantics that inform which surfaces of the environment the character may use for support during the motion. The precomputed motion semantics can be easily integrated into standard navigation and animation pipelines to greatly enhance the motion capabilities of virtual characters. The computational efficiency of our approach enables two additional applications. Environment designers can interactively design new environments and receive instant feedback on how characters may potentially interact with them, which can be used for iterative modeling and refinement. End users can dynamically edit virtual worlds, and characters will automatically accommodate the changes in the environment in their movement strategies.
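The abstract's core idea, matching a motion's contact requirements against environment geometry to precompute where that motion is afforded, can be illustrated with a small sketch. This is not the paper's implementation; the data structures (`ContactFeature`, `Motion`), the 2-D point-based surface representation, and the tolerance test are all simplifying assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ContactFeature:
    """A salient contact a motion requires: a support surface at a
    position relative to the motion's start (hypothetical model)."""
    offset: Tuple[float, float]   # (horizontal distance, height) from start
    tolerance: float              # how far a surface may deviate and still match

@dataclass
class Motion:
    name: str
    contacts: List[ContactFeature]

def find_transitions(motion: Motion,
                     surfaces: List[Tuple[float, float]],
                     starts: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Precompute start locations where `motion` is afforded: a start is
    valid only if every contact feature is matched by some surface point."""
    valid = []
    for sx, sy in starts:
        if all(
            any(abs(px - (sx + c.offset[0])) <= c.tolerance and
                abs(py - (sy + c.offset[1])) <= c.tolerance
                for (px, py) in surfaces)
            for c in motion.contacts
        ):
            valid.append((sx, sy))
    return valid

# Hypothetical 2-D scene: a "vault" needs support 1 m ahead at 1 m height.
vault = Motion("vault", [ContactFeature(offset=(1.0, 1.0), tolerance=0.2)])
surfaces = [(2.0, 1.0)]            # one ledge-top sample point in the scene
starts = [(0.0, 0.0), (1.0, 0.0)]  # candidate start positions
print(find_transitions(vault, surfaces, starts))  # -> [(1.0, 0.0)]
```

Because this matching runs offline over the static environment, the results can be cached and queried at interactive rates, which is what enables the design-feedback and dynamic-editing applications described above.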


Downloads

Download Paper
[PDF]
Download Video
[Video]
Download Citation
[BibTeX]