2 Background
In RC, a high-dimensional dynamical core called a reservoir is perturbed with an external input.
The reservoir states are then combined linearly to produce the output. The readout parameters
are obtained by regressing the expected output onto the states of the teacher-driven reservoir.
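As a concrete illustration, the following sketch trains such a readout for a standard tanh echo state network using ridge regression; the network size, weight scalings, delay task, and ridge parameter are illustrative assumptions rather than the specific setup considered here.

```python
import numpy as np

# Minimal sketch of reservoir computing with a tanh echo state network.
# All sizes and scalings below are assumptions for illustration only.
rng = np.random.default_rng(0)
n_res, n_in, T = 200, 1, 1000

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))          # fixed input weights
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))            # fixed reservoir weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))            # scale spectral radius to 0.9

u = rng.uniform(-1, 1, size=(T, n_in))                     # teacher input signal
y_target = np.roll(u[:, 0], 3)                             # task: recall input from 3 steps earlier

# Drive the reservoir with the teacher input and collect its states.
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Linear readout: regress the expected output onto the collected states (ridge regression).
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y_target)
y_pred = states @ W_out
```

Only the readout weights are fitted; the reservoir and input weights stay fixed, consistent with the description above.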
Unlike other forms of neural computation, computation in RC takes place within the
transient dynamics of the reservoir. The computational power of the reservoir is attributed
to a short-term memory created by the reservoir [8] and the ability to preserve the temporal
information from distinct signals over time [9]. Several studies attributed this property to the
dynamical regime of the reservoir and showed it to be optimal when the system operates in
the critical dynamical regime—a regime in which perturbations to the system’s trajectory in
its phase space neither spread nor die out [1–3, 14]. The reason for this observation remains
unknown. Maass et al. [9] proved that given the two properties of separation and approximation,
a reservoir system can approximate any time-invariant filter with fading memory. The separation property
ensures that the reservoir perturbations from distinct signals remain distinguishable, whereas
the approximation property ensures that the output layer can approximate any function of the
reservoir states to an arbitrary degree of accuracy. Jaeger [7] proposed that an ideal reservoir
needs to have the so-called echo state property (ESP), meaning that the reservoir state
asymptotically depends only on the input history and not on the reservoir's initial state. It has also been
suggested that the reservoir dynamics act like a spatiotemporal kernel, projecting the input
signal onto a high-dimensional feature space [6].
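To make the ESP concrete, the sketch below drives two copies of a reservoir, started from different initial states, with the same input and measures how far apart their states remain; the spectral-radius values, weight distributions, and network size are assumptions chosen for illustration.

```python
import numpy as np

# Illustration of the echo state property: reservoirs started from different
# initial states, driven by the same input, should converge so that the state
# reflects the input history rather than the initial condition.
rng = np.random.default_rng(1)
n_res, T = 200, 500
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W_raw = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
u = rng.uniform(-1, 1, size=(T, 1))

def state_divergence(spectral_radius):
    """Distance between two trajectories driven by the same input."""
    W = W_raw * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W_raw))))
    x_a = rng.standard_normal(n_res)    # two different initial states
    x_b = rng.standard_normal(n_res)
    for t in range(T):
        x_a = np.tanh(W @ x_a + W_in @ u[t])
        x_b = np.tanh(W @ x_b + W_in @ u[t])
    return np.linalg.norm(x_a - x_b)

print(state_divergence(0.9))   # near zero: the initial state is forgotten (ESP)
print(state_divergence(2.5))   # often large: initial-state dependence persists
```

In this picture, tuning the spectral radius toward one moves the reservoir toward the critical regime discussed above, in which such perturbations neither spread nor die out.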