Here W = UDU⁻¹ is the eigenvalue decomposition of W, d is a column vector containing the diagonal elements of D, V̄ = U⁻¹V are the input weights in the basis defined by the eigenvectors of W, and ◦ is the Hadamard product. The variance of the input is denoted by u² = 1/3, and d̄ is a column vector whose elements are d̄_l = 1/(1 − d_l).
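The quantities above can be computed directly from a given weight matrix. The following is a minimal sketch, assuming a randomly generated W (scaled so its spectral radius is below 1) and random input weights V; the reservoir size and scaling are illustrative choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # reservoir size (illustrative)

# Random recurrent weight matrix W, rescaled so its spectral radius is < 1
W = rng.standard_normal((n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

V = rng.standard_normal((n, 1))  # input weight column vector

# Eigenvalue decomposition W = U D U^{-1}
d, U = np.linalg.eig(W)          # d: diagonal elements of D (eigenvalues)
U_inv = np.linalg.inv(U)

V_bar = U_inv @ V                # input weights in the eigenbasis of W
d_bar = 1.0 / (1.0 - d)          # elements d̄_l = 1/(1 - d_l)
sigma_u2 = 1.0 / 3.0             # input variance u² = 1/3, consistent with
                                 # inputs drawn uniformly from [-1, 1]
```

The Hadamard product ◦ used in the solution corresponds to NumPy's elementwise `*` on these vectors.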
Our analytical solution for Wout assumes the true variance of the input. By appeal to the central limit theorem, we expect that if one calculates the memory curve for a given ESN numerically, the results will, owing to the finite training size, vary according to a normal distribution around the analytical values. Previous attempts to characterize the memory curve of ESNs [15] used an annealed approximation over the Gaussian Orthogonal Ensemble (GOE) to simplify the problem; here, in contrast, we present an exact solution.