Human Eye Motions in Virtual Characters
The project investigates human eye movement for the animation of virtual characters. The goal is to improve the realism of virtual humans by modelling an animated figure's eyes on recorded human eye motion. The collected motion data will drive a new model that automatically synthesizes eye rotations. The results of this research can contribute to future representations of virtual actors and avatars in animated games and films.
Eye motions are recorded using the Dikablis eye-tracker, a head-mounted unit with binocular cameras pointed at each eye. Combined with the motion capture system, it captures both a subject's head movement and eye gaze.
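Combining the two data streams amounts to composing the head orientation (from motion capture) with the eye rotation (from the eye tracker) to recover a world-space gaze direction. A minimal sketch, reduced to yaw-only rotations for illustration (the function names and the single-axis simplification are assumptions, not the project's actual pipeline):

```python
import math

def rot_y(deg):
    """3x3 rotation matrix about the vertical (yaw) axis."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def mat_mul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def world_gaze(head_yaw_deg, eye_yaw_deg):
    """Compose head orientation (mocap) with eye rotation (eye tracker)
    to get the gaze direction in world space."""
    forward = [0.0, 0.0, 1.0]  # gaze direction in the eye's local frame
    return mat_vec(mat_mul(rot_y(head_yaw_deg), rot_y(eye_yaw_deg)), forward)
```

For example, a head turned 30 degrees while the eyes counter-rotate by 30 degrees yields a gaze pointing straight ahead in world space, which is the familiar vestibulo-ocular stabilization behavior.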
My job is to determine which method, procedural or data-driven animation, works best for mapping eye motion onto a character. To do so, a simple set of scenarios must be captured and used to implement each approach separately. Following implementation, the results will be evaluated on how convincing the animated characters appear.
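To illustrate the procedural side of the comparison: rather than replaying captured data, a procedural model generates eye motion from statistical rules. One well-known rule from the eye-movement literature is the "main sequence", where saccade duration grows roughly linearly with amplitude. A minimal sketch under that assumption (the coefficients and function names are illustrative, not taken from this project):

```python
def saccade_duration_ms(amplitude_deg, slope=2.2, intercept=21.0):
    """Main-sequence rule: duration rises roughly linearly with amplitude.
    The slope/intercept values here are illustrative figures commonly
    cited in the literature, not measurements from this project."""
    return intercept + slope * amplitude_deg

def procedural_gaze_shift(target_angle_deg, frame_ms=16.7):
    """Generate per-frame eye angles for one saccade toward the target,
    linearly interpolating over the predicted duration (at ~60 fps)."""
    duration = saccade_duration_ms(abs(target_angle_deg))
    frames = max(1, round(duration / frame_ms))
    return [target_angle_deg * (i + 1) / frames for i in range(frames)]
```

A data-driven implementation would instead sample or replay segments of the recorded Dikablis data, which is what makes the side-by-side evaluation of the two approaches meaningful.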