Abstract
Reservoir computing is a recent trend in neural networks that uses dynamical perturbations
in the phase space of a system to compute a desired target function. We show that one can formulate
an expectation of system performance in a simple model of reservoir computing called
echo state networks. In contrast with previous theoretical frameworks, which rely on an annealed
approximation, we calculate the exact optimal output weights as a function of the structure
of the system and the properties of the input and the target function. Our calculation agrees
with numerical simulations. To the best of our knowledge, this work presents the first exact
analytical solution for the optimal output weights in echo state networks.
Keywords: Reservoir computing, echo state networks, analytical training, Wiener filters, dynamics
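For context, the sketch below illustrates how echo state network output weights are typically obtained numerically, as a linear least-squares (Wiener-filter-style) fit of the collected reservoir states to the target; this is the standard numerical training that an exact analytical solution would replace. All parameters here (reservoir size, spectral radius, the delay-3 target, the ridge term) are illustrative assumptions, not the setup analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 2000                      # reservoir size, number of time steps (assumed)

# Random reservoir and input weights, rescaled so the spectral radius is below one
# (a common sufficient condition aimed at the echo state property).
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, size=N)

u = rng.uniform(-1, 1, size=T)        # scalar input sequence
y = np.roll(u, 3)                     # example target: the input delayed by 3 steps

# Drive the reservoir and collect its states: x_t = tanh(W x_{t-1} + w_in u_t).
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

# Least-squares (Wiener-style) output weights, w_out = (X^T X)^{-1} X^T y,
# computed here with a small ridge term for numerical stability.
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

y_hat = X @ w_out
print("training MSE:", np.mean((y_hat - y) ** 2))
```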