The previous discussion only estimates the first non-Gaussian
vector. One way to compute the remaining higher-order vectors
is to follow the stochastic gradient ascent (SGA) scheme:
start with a set of orthonormalized vectors, update them using
the suggested iteration step, and recover the orthogonality
using Gram-Schmidt orthonormalization (GSO).
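A minimal sketch of this scheme is given below, assuming NumPy; the function name `sga_epoch` and the argument `step`, which stands in for the suggested iteration step, are hypothetical.

```python
import numpy as np

def gram_schmidt(V):
    """Re-orthonormalize the columns of V (modified Gram-Schmidt)."""
    Q = np.zeros_like(V)
    for i in range(V.shape[1]):
        v = V[:, i].copy()
        for j in range(i):
            v -= (Q[:, j] @ v) * Q[:, j]  # remove component along earlier vector
        Q[:, i] = v / np.linalg.norm(v)   # renormalize
    return Q

def sga_epoch(V, X, step):
    """Update every column of V with `step` on each observation,
    then restore orthonormality with GSO."""
    for x in X:
        for i in range(V.shape[1]):
            V[:, i] = step(V[:, i], x)    # gradient-ascent update per vector
        V = gram_schmidt(V)               # recover orthogonality
    return V
```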
For real-time online computation, however, the time-consuming GSO
should be avoided. Further, the non-Gaussian vectors should be
orthogonal to each other to ensure independence. It therefore
helps to generate “observations” only in a complementary
space for the computation of the higher-order eigenvectors.
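This complementary-space idea can be sketched as follows; a minimal illustration assuming NumPy and unit-norm estimates of the lower-order vectors (the function name `complementary_residual` is hypothetical).

```python
import numpy as np

def complementary_residual(x, V):
    """Project observation x onto the orthogonal complement of the span
    of the columns of V, the current estimates of the lower-order vectors.

    Columns of V are assumed unit norm; the residual serves as the
    "observation" for updating the next higher-order vector.
    """
    for i in range(V.shape[1]):
        x = x - (V[:, i] @ x) * V[:, i]  # subtract projection on i-th estimate
    return x
```

For the second-order vector, V holds the single column v1(n).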
For example, to compute the second-order non-Gaussian
vector, the projection of the data onto the estimated first-order
eigenvector v1(n) is first subtracted from the data, as shown in: