The preceding discussion estimates only the first non-Gaussian
vector. One way to compute the remaining higher-order vectors
is to follow the stochastic gradient ascent (SGA) approach:
start with a set of orthonormal vectors, update them using
the suggested iteration step, and restore orthogonality
with Gram-Schmidt orthonormalization (GSO). For real-time
online computation, however, the time-consuming GSO step
should be avoided. Moreover, the non-Gaussian vectors must be
mutually orthogonal in order to ensure independence. It therefore
helps to generate "observations" only in a complementary
space when computing the higher-order eigenvectors.
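As context for why GSO is costly in an online setting, the following is a minimal sketch of classical Gram-Schmidt orthonormalization; the function name `gram_schmidt` and the NumPy formulation are illustrative choices, not part of the original text:

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the columns of V via classical Gram-Schmidt.

    Each column has the projections onto all previously processed
    columns subtracted, then is normalized to unit length.
    """
    Q = np.zeros_like(V, dtype=float)
    for k in range(V.shape[1]):
        v = V[:, k].astype(float)
        for j in range(k):
            # Remove the component of v along the j-th orthonormal vector.
            v = v - (Q[:, j] @ v) * Q[:, j]
        Q[:, k] = v / np.linalg.norm(v)
    return Q
```

The nested loop makes the cost per update quadratic in the number of vectors, which is what motivates avoiding a full GSO pass at every iteration.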
For example, to compute the second-order non-Gaussian
vector, the data is first deflated by subtracting its projection
onto the estimated first-order eigenvector v1(n).
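This deflation step can be sketched as follows; the helper name `deflate` is hypothetical, and the formula x − (v1ᵀx)v1 is the standard orthogonal-projection removal that the text describes (assuming v1 has unit norm):

```python
import numpy as np

def deflate(x, v1):
    """Remove the component of x along the unit-norm estimated vector v1.

    The result lies in the complementary space orthogonal to v1, so a
    subsequent update driven by it cannot re-learn the v1 direction.
    """
    return x - (v1 @ x) * v1
```

Feeding these deflated "observations" to the same iteration step then yields the second non-Gaussian vector without an explicit GSO pass.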
