It is not the case that all models that rely on distributed, overlapping representations
forget catastrophically in the presence of new information. In particular, the class of
convolution-correlation models (e.g., CHARM (Metcalfe, 1982) and TODAM (Murdock,
1983)), and Sparse Distributed Memory (SDM) (Kanerva, 1989) can learn new information
in a sequential manner and can, in addition, generalize on new input. The performance of
these models on previously learned information declines gradually, rather than falling off
abruptly, when learning new patterns. One might argue that CHARM, TODAM, and SDM
are not “connectionist” models. While, strictly speaking, this may be true, convolution-correlation
models are readily shown to be isomorphic to sigma-pi connectionist models, and
SDM has been shown to be isomorphic to a Hopfield network (Keeler, 1988).
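To make the graceful-degradation claim concrete, the following is a minimal sketch (not drawn from the original papers) of how a convolution-correlation memory in the CHARM/TODAM family behaves. Pairs of items are stored by superposing their circular convolutions in a single memory trace, and a stored item is retrieved by circular correlation with its cue; the vector dimension, number of pairs, and FFT-based implementation are illustrative choices, not part of any published model specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024  # illustrative vector dimension

def cconv(a, b):
    # circular convolution, computed in the frequency domain
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    # circular correlation: approximate inverse of circular convolution
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def random_item():
    # random vector with expected unit norm
    return rng.normal(0.0, 1.0 / np.sqrt(n), n)

pairs = [(random_item(), random_item()) for _ in range(10)]
memory = np.zeros(n)  # single composite trace

for i, (cue, target) in enumerate(pairs):
    memory += cconv(cue, target)  # sequential, additive storage of each new pair
    # probe with the FIRST cue after each new pair is added
    retrieved = ccorr(pairs[0][0], memory)
    sim = np.dot(retrieved, pairs[0][1]) / (
        np.linalg.norm(retrieved) * np.linalg.norm(pairs[0][1])
    )
    print(f"after {i + 1} pairs, similarity to first target: {sim:.2f}")
```

Because each new pair is simply added to the composite trace, later learning injects noise into earlier retrievals rather than overwriting them: the printed similarity to the first stored target declines gradually as more pairs are stored, which is exactly the graceful degradation contrasted above with catastrophic forgetting.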