In our theoretical investigations, we find that NC succeeds by exploiting the
Ambiguity decomposition of the mean squared error. We ground NC in the statistical
context of bias, variance and covariance, including links to a number of other
algorithms that have exploited Ambiguity. The discoveries we make regarding NC are not
limited to neural networks: the majority of our observations are in fact properties of
the mean squared error function itself. NC is therefore best viewed as a framework,
rather than an algorithm in its own right, meaning several other learning techniques
could make use of it.
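The Ambiguity decomposition referred to above states that, for a uniformly averaged ensemble, the squared error of the ensemble equals the average squared error of the members minus the average squared deviation of the members from the ensemble output (the Ambiguity term). A minimal numerical sketch of this identity, with synthetic predictions and targets assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 5, 100                      # ensemble members, data points (illustrative sizes)
y = rng.normal(size=N)             # synthetic targets
f = y + rng.normal(size=(M, N))    # synthetic member predictions
fbar = f.mean(axis=0)              # simple-average ensemble output

ensemble_err = np.mean((fbar - y) ** 2)   # ensemble MSE
avg_member_err = np.mean((f - y) ** 2)    # average member MSE
ambiguity = np.mean((f - fbar) ** 2)      # average spread around the ensemble

# Ambiguity decomposition: ensemble MSE = avg member MSE - Ambiguity
print(np.allclose(ensemble_err, avg_member_err - ambiguity))
```

Because the Ambiguity term is subtracted, disagreement among members that does not harm their individual accuracy directly lowers the ensemble error, which is the property NC exploits.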