The computation of PCA requires eigenvalue decomposition
(EVD) of the covariance matrix of the feature vectors.
One well-known EVD method is the cyclic Jacobi method (Golub and van Loan, 1996). Jacobi's method, which diagonalizes a symmetric matrix, requires on the order of $O(d^3 + d^2 n)$ operations (Golub and van Loan, 1996), where $d$ is the dimension of the feature vectors and $n$ is the number of feature vectors (samples). Such a cost is undesirable in many applications (e.g. fixed-point implementations).
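For concreteness, a minimal NumPy sketch of EVD-based PCA is given below; the routine name pca_evd and the use of np.linalg.eigh as the symmetric EVD step are our illustrative choices, not taken from the cited works.

import numpy as np

def pca_evd(X, k):
    # X: (n, d) array of n feature vectors of dimension d; k: components kept.
    Xc = X - X.mean(axis=0)               # centre the data
    C = (Xc.T @ Xc) / (X.shape[0] - 1)    # d x d sample covariance: O(d^2 n)
    evals, evecs = np.linalg.eigh(C)      # symmetric EVD: O(d^3)
    order = np.argsort(evals)[::-1]       # eigenvalues in descending order
    W = evecs[:, order[:k]]               # top-k principal directions
    return Xc @ W                         # projected data, shape (n, k)

The two comments mark the $O(d^2 n)$ covariance accumulation and the $O(d^3)$ decomposition that together give the cost quoted above.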
A number of methods have been proposed in the literature for computing the PCA transform with reduced computational complexity. Reddy and Herron (2001) proposed a modification to Jacobi's method that favours fixed-point implementations. Their computational complexity is, however, still of order $O(d^3)$ per sweep of symmetric rotations (assuming the symmetric matrix of size $d \times d$ has been computed beforehand), so the improvement in computational complexity is essentially negligible.
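For reference, the classical floating-point cyclic Jacobi sweep is sketched below; this is the textbook scheme, not the fixed-point variant of Reddy and Herron (2001), and the routine name jacobi_evd is ours.

import numpy as np

def jacobi_evd(C, sweeps=10, tol=1e-12):
    # Classical cyclic Jacobi EVD of a symmetric matrix C.
    A = np.array(C, dtype=float)
    d = A.shape[0]
    V = np.eye(d)
    for _ in range(sweeps):
        for p in range(d - 1):
            for q in range(p + 1, d):      # one sweep visits all d(d-1)/2 pairs
                if abs(A[p, q]) < tol:
                    continue
                # rotation angle that zeroes the off-diagonal entry A[p, q]
                theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(d)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J            # full products for clarity; updating only
                V = V @ J                  # rows/columns p and q would cost O(d) per rotation
    return np.diag(A), V                   # approximate eigenvalues and eigenvectors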
Roweis (1997) proposed an expectation-maximization (EM) algorithm for PCA that is computationally more efficient than the EVD approach. The technique can nevertheless be expensive in time: it requires a matrix inversion in both the E-step and the M-step of every iteration. Furthermore, the EM algorithm does not converge to a global maximum in general; it attains only a local maximum, and thus the choice of the initial guess used in the algorithm becomes crucial.
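A minimal sketch of the EM iteration for PCA, following the zero-observation-noise updates described by Roweis (1997), makes the two per-iteration inversions explicit; the routine name, default iteration count, and random initialization are our illustrative choices.

import numpy as np

def em_pca(X, k, n_iter=50, seed=0):
    # X: (d, n) array of centred feature vectors, one sample per column.
    d, n = X.shape
    W = np.random.default_rng(seed).standard_normal((d, k))  # initial guess; EM only reaches a local maximum
    for _ in range(n_iter):
        # E-step: latent coordinates Y; solves a k x k system (first inversion)
        Y = np.linalg.solve(W.T @ W, W.T @ X)
        # M-step: updated basis W; inverts another k x k matrix (second inversion)
        W = (X @ Y.T) @ np.linalg.inv(Y @ Y.T)
    return W  # columns span the estimated k-dimensional principal subspace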
The power method (Schilling and Harris, 2000) is also used to find the leading eigenvector; it is a less expensive method, but it can compute only the single most dominant eigenvector.
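A sketch of the power iteration follows (the routine name and stopping rule are ours; C denotes the $d \times d$ covariance matrix):

import numpy as np

def power_method(C, n_iter=1000, tol=1e-10):
    # C: symmetric positive semi-definite d x d covariance matrix.
    rng = np.random.default_rng(0)
    v = rng.standard_normal(C.shape[0])
    v /= np.linalg.norm(v)                 # random unit starting vector
    for _ in range(n_iter):
        w = C @ v                          # one matrix-vector product: O(d^2) per step
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - v) < tol:    # stop once the direction has stabilised
            v = w
            break
        v = w
    return v, float(v @ C @ v)             # leading eigenvector and its Rayleigh quotient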
Another method is the snapshot algorithm (Sirovich, 1987), which does not explicitly compute the sample covariance matrix; it requires a matrix inversion, however, and is based on the assumption that the eigenvectors being sought are linear combinations of the feature vectors.
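The linear-combination assumption can be illustrated with the usual Gram-matrix formulation of the snapshot idea (a sketch in our notation; this variant replaces the explicit inversion with an eigendecomposition of the small $n \times n$ matrix, so it should be read as one possible realisation rather than Sirovich's exact procedure):

import numpy as np

def snapshot_pca(X, k):
    # X: (d, n) array of centred feature vectors, one per column, with n << d.
    G = X.T @ X                            # n x n Gram matrix; no d x d covariance formed
    evals, A = np.linalg.eigh(G)           # small symmetric EVD: O(n^3)
    order = np.argsort(evals)[::-1][:k]    # indices of the top-k eigenvalues
    V = X @ A[:, order]                    # each eigenvector is a linear combination of the samples
    return V / np.linalg.norm(V, axis=0)   # normalise columns to unit length

If $G a = \lambda a$, then $X X^T (X a) = \lambda (X a)$, so each column of V is an eigenvector of the (never formed) covariance matrix.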