Experiment 5 shows that this model, unlike standard backpropagation models, exhibits a
plausible list-length effect. This is a direct consequence of its being able to do sequential
learning in a reasonable manner, i.e., in a way that produces gradual forgetting. The list-length
effect refers to the finding that as a list of items to be learned grows longer, the earlier items
are increasingly forgotten. While this is certainly true of standard backpropagation, it is true only in a somewhat
trivial sense. As Figure 5 shows, beyond one or two items, a backpropagation network has
largely forgotten all of the previously learned items. On the other hand, the gradual forgetting
that occurs in the pseudo-recurrent networks produces a more cognitively plausible list-length
effect. In addition, this network (as well as standard backpropagation) exhibits a list-strength
effect: strengthening certain items by repetition makes those repeated items easier to
recognize without impairing the network's ability to recognize the other, non-repeated
items.
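The abrupt forgetting attributed above to standard backpropagation can be demonstrated directly. The following is a minimal NumPy sketch, not the pseudo-recurrent architecture discussed here, but a plain two-layer backpropagation network trained on a list of input-output associations one item at a time. All names, sizes, and learning parameters are illustrative assumptions. After each new item is learned, the network's error on the first item of the list is measured; it rises sharply once even a single subsequent item has been trained, which is the trivial list-length effect described above.

```python
import numpy as np

# Illustrative sketch: sequential (item-by-item) training of a standard
# backpropagation network, showing forgetting of the first list item.
# Network sizes, learning rate, and patterns are arbitrary choices.

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class BackpropNet:
    def __init__(self, n_in=8, n_hid=16, n_out=8):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))

    def forward(self, x):
        h = sigmoid(x @ self.W1)
        y = sigmoid(h @ self.W2)
        return h, y

    def train_item(self, x, t, lr=0.5, epochs=500):
        # Sequential learning: only the current item is presented,
        # with no interleaving of previously learned items.
        for _ in range(epochs):
            h, y = self.forward(x)
            dy = (y - t) * y * (1.0 - y)          # output-layer delta
            dh = (dy @ self.W2.T) * h * (1.0 - h)  # hidden-layer delta
            self.W2 -= lr * np.outer(h, dy)
            self.W1 -= lr * np.outer(x, dh)

    def error(self, x, t):
        _, y = self.forward(x)
        return float(np.mean((y - t) ** 2))


# A "list" of six distinct associations: one-hot input i maps to
# one-hot output (i + 1) mod 8.
eye = np.eye(8)
items = [(eye[i], eye[(i + 1) % 8]) for i in range(6)]

net = BackpropNet()
first_x, first_t = items[0]
errors_on_first = []
for x, t in items:
    net.train_item(x, t)
    errors_on_first.append(net.error(first_x, first_t))

# errors_on_first[0] is near zero (item just learned); the later
# entries grow as subsequent items overwrite the shared weights.
print(errors_on_first)
```

Interleaved (concurrent) training on the whole list would avoid this forgetting, which is precisely why sequential learning is the problematic case for standard backpropagation.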