Oshita Lab. Research Theme

Real-time Character Motion Control Using Data Gloves

Users of computer games, avatar-based communication, and real-time animation systems want to move a character freely in a virtual world. However, current motion control interfaces offer very limited flexibility, because character motion is simply played back from pre-defined motion data. In this paper, we present a motion control method that uses two data gloves as an input device to make a virtual character perform various motions. Each part of the character's body is controlled by input from the data gloves; for example, a user controls the character's left arm and left leg with their left hand. However, data gloves provide too few degrees of freedom to control all of the character's body parts directly. Moreover, it is difficult for users to perform complex motions such as stepping or jumping, because multiple body parts must be controlled at the same time. To solve these problems, we introduce three novel ideas. First, we change the mapping between the user's hands and the character's body parts dynamically; for example, when both hands move in the same direction, they control the pelvis instead of the arms or legs. In addition, we introduce a manual switch between arms and legs: a hand controls either the arms or the legs, and the user switches the mode with a finger. Second, we use mechanisms borrowed from real puppets to control multiple body parts synchronously. Third, we combine motion-data-based control with position-based control: when needed, a motion is selected from the pre-defined motion data and executed. We use this approach for locomotive motions such as rightward steps, leftward steps, backward steps, and jumps; a motion is selected when the character's waist moves more than a certain distance in a given direction. Using this approach, complex motions can be performed easily.
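The dynamic hand-to-body mapping and the displacement-triggered motion selection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 0.9 alignment threshold, the 0.15 m displacement threshold, the clip names, and the 2D horizontal waist displacement are all assumptions made for the example.

```python
import math

STEP_THRESHOLD = 0.15  # assumed waist-displacement threshold (meters)

# Assumed names for the pre-defined motion clips.
MOTIONS = {
    "right": "rightward_step",
    "left": "leftward_step",
    "back": "backward_step",
}

def map_hands(left_vel, right_vel, leg_mode):
    """Dynamic mapping: when both hand velocities are nearly parallel,
    the hands jointly control the pelvis; otherwise each hand drives
    the arms or the legs, depending on the finger-switched mode."""
    dot = sum(a * b for a, b in zip(left_vel, right_vel))
    norm = math.hypot(*left_vel) * math.hypot(*right_vel)
    if norm > 0 and dot / norm > 0.9:  # hands moving the same way
        return "pelvis"
    return "legs" if leg_mode else "arms"

def select_motion(waist_disp):
    """Motion-data-based control: pick a pre-defined clip once the
    waist has moved more than a threshold distance in some direction;
    otherwise stay with direct position-based control."""
    dx, dz = waist_disp  # horizontal displacement (right, forward)
    if math.hypot(dx, dz) < STEP_THRESHOLD:
        return None  # below threshold: keep position-based control
    if abs(dx) >= abs(dz):
        return MOTIONS["right"] if dx > 0 else MOTIONS["left"]
    return MOTIONS["back"] if dz < 0 else None
```

For instance, `map_hands((1, 0, 0), (1, 0, 0), False)` returns `"pelvis"`, while a waist displacement of `(0.2, 0.0)` selects the rightward-step clip.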
