Varying the number of pseudopatterns used to transfer information in each direction was found to control how much information passes from one memory to the other. Why not, then, set the number of pseudopatterns very high in both directions to guarantee maximal information transfer? One problem with doing so is convergence time.
The more pseudopatterns the network must learn, the longer it takes to converge. If too many pseudopatterns are added, convergence may take an extremely long time and, in some cases, may not occur at all.
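This trade-off can be illustrated with a minimal sketch, not the actual setup of the study: the network sizes, learning rate, and error tolerance below are illustrative assumptions. A source network labels random inputs to produce pseudopatterns, and a fresh target network is trained on them by plain backpropagation until its error falls below the tolerance; the epoch count returned by `train` is the convergence time in question, and it generally grows with the number of pseudopatterns.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(n_in=8, n_hidden=6, n_out=8):
    # One-hidden-layer tanh network with random weights (hypothetical sizes).
    return [rng.normal(0.0, 0.5, (n_in, n_hidden)),
            rng.normal(0.0, 0.5, (n_hidden, n_out))]

def forward(net, x):
    h = np.tanh(x @ net[0])
    return np.tanh(h @ net[1])

def pseudopatterns(net, n, n_in=8):
    # Random binary inputs paired with the source network's responses to them.
    X = rng.choice([-1.0, 1.0], size=(n, n_in))
    return X, forward(net, X)

def train(net, X, Y, lr=0.05, tol=0.01, max_epochs=5000):
    # Full-batch backprop; returns the epochs needed to reach the tolerance.
    for epoch in range(1, max_epochs + 1):
        H = np.tanh(X @ net[0])
        out = np.tanh(H @ net[1])
        err = out - Y
        if np.mean(err ** 2) < tol:
            return epoch
        d_out = err * (1 - out ** 2)        # gradient through output tanh
        d_h = (d_out @ net[1].T) * (1 - H ** 2)
        net[1] -= lr * (H.T @ d_out) / len(X)
        net[0] -= lr * (X.T @ d_h) / len(X)
    return max_epochs  # did not converge within the epoch budget

source = make_net()
X, Y = pseudopatterns(source, 20)
epochs_needed = train(make_net(), X, Y)
print("epochs to converge on 20 pseudopatterns:", epochs_needed)
```

Rerunning the last three lines with a much larger pseudopattern count (and the same tolerance) shows the effect discussed above: the epoch count climbs, and past some point `train` exhausts its budget without converging at all.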