Abstract—Computer facial animation remains a challenging
topic within the computer graphics community. In
this paper, a realistic and expressive computer facial animation
system is developed by automated learning from Vicon Nexus
facial motion capture data. Facial motion data for different
emotions, collected with Vicon Nexus, are processed using
dimensionality reduction techniques such as PCA and EMPCA.
EMPCA was found to preserve the original data most faithfully
among the techniques compared. Ultimately, the emotion data are
mapped to a 3D animated face, producing animations that clearly
show the motion of the eyes, eyebrows, and lips. Our approach
uses data captured from
a real speaker, resulting in more natural and lifelike facial
animations. This approach can be used for various applications
and serve as a prototyping tool to automatically generate realistic
and expressive facial animation.
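
For reference on the dimensionality reduction step named above, the following is a minimal NumPy sketch of the standard EM algorithm for PCA (EM-PCA, Roweis 1998), not the implementation used in this work; the function name em_pca, the marker-matrix layout, and the component count are illustrative assumptions.

import numpy as np

def em_pca(frames, n_components=5, n_iters=100, seed=0):
    # frames: (n_frames, n_features) array of flattened 3D marker
    # coordinates, one row per captured frame (assumed layout).
    rng = np.random.default_rng(seed)
    Y = (frames - frames.mean(axis=0)).T        # centre, then (features x frames)
    W = rng.standard_normal((Y.shape[0], n_components))
    for _ in range(n_iters):
        # E-step: latent coordinates given the current basis
        X = np.linalg.solve(W.T @ W, W.T @ Y)
        # M-step: basis given the latent coordinates
        W = Y @ X.T @ np.linalg.inv(X @ X.T)
    W, _ = np.linalg.qr(W)                      # orthonormalise the basis
    scores = (W.T @ Y).T                        # low-dimensional trajectories
    return W, scores

In such a pipeline, a frame can be approximately reconstructed as the data mean plus W @ scores[t], so the low-dimensional scores are what would be retargeted onto the 3D face rig.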